124 results for big data storage
Abstract:
Arts education research, as an interdisciplinary field, has developed in the shadows of a number of research traditions. However, amid all the methodological innovation, I believe there is one particular, distinctive and radical research strategy which arts educators have created to research the practice of arts education: namely arts-based research. For many, and Elliot Eisner from Stanford University was among the first, arts education needed a research approach which could deal with the complex dynamics of arts education in the classroom. What was needed was ‘an approach to the conduct of educational research that was rooted in the arts and that used aesthetically crafted forms to reveal aspects of practice that mattered educationally’ (Eisner 2006: 11). While arts education researchers were crafting the principles and practices of arts-based research, fellow artist/researchers in the creative arts were addressing similar needs and fashioning their own exacting research strategies. This chapter aligns arts-based research with the complementary research practices established in creative arts studios and identifies the shared and truly radical nature of these moves. Finally, and in a contemporary turn many will find surprising, I will discuss how the radical aspects of these methodologies are now being held up as core elements of what is being called the fourth paradigm of scientific research, known as eScience. Could it be that the radical dynamics of arts-based research prefigured the needs of eScience researchers who are currently struggling to manage the ‘deluge of Big Data’ that is disrupting their well-established scientific methods?
Abstract:
The promise of Big Data and Learning Analytics to revolutionise educational institutions, endeavours, and actions through more and better data is now compelling. Multiple, and continually updating, data sets produce a new sense of ‘personalised learning’. A crucial attribute of the datafication, and subsequent profiling, of learner behaviour and engagement is the continual modification of the learning environment to induce greater levels of investment on the part of each learner. The assumption is that more and better data, gathered faster and fed into ever-updating algorithms, provide more complete tools to understand, and therefore improve, learning experiences through adaptive personalisation. The argument in this paper is that Learning Personalisation names a new logistics of investment as the common ‘sense’ of the school, in which disciplinary education is ‘both disappearing and giving way to frightful continual training, to continual monitoring’.
Abstract:
Queensland University of Technology (QUT) is a large multidisciplinary university located in Brisbane, Queensland, Australia. QUT is increasing its research focus and is developing its research support services. It has adopted a model of collaboration between the Library, High Performance Computing and Research Support (HPC) and, more broadly, Information Technology Services (ITS). Research support services provided by the Library include the provision of information resources and discovery services, bibliographic management software, assistance with publishing (publishing strategies, identifying high-impact journals, dealing with publishers and the peer review process), citation analysis and calculating authors’ H Index. Research data management services are being developed by the Library and HPC working in collaboration. The HPC group within ITS supports research computing infrastructure, research development and engagement activities, researcher consultation, high-speed computation and data storage systems, 2D/3D (immersive) visualisation tools, parallelisation and optimisation of research codes, statistics/data modelling training and support (both qualitative and quantitative) and support for the university’s central Access Grid collaboration facility. Development and engagement activities include participation in research grants and papers, student supervision and internships, and the sponsorship, incubation and adoption of new computing technologies for research. ITS also provides other services that support research, including ICT training, research infrastructure (networking, data storage, federated access and authorisation, virtualisation) and corporate systems for research administration. Seminars and workshops are offered to increase awareness and uptake of new and existing services. A series of online surveys on eResearch practices and skills and a number of focus groups were conducted to better inform the development of research support services.
Progress towards the provision of research support is described within the context of organisational frameworks; resourcing; infrastructure; integration; collaboration; change management; engagement; awareness and skills; new services; and leadership. Challenges to be addressed include the need to redeploy existing operational resources toward new research support services, supporting a rapidly growing research profile across the university, the growing need for the use and support of IT in research programs, finding capacity to address the diverse research support needs across the disciplines, operationalising new research support services following their implementation in project mode, embedding new specialist staff roles, cross-skilling Liaison Librarians, and ensuring continued collaboration between stakeholders.
Abstract:
One of the main challenges of slow-speed machinery condition monitoring is that the energy generated by an incipient defect is too weak to be detected by traditional vibration measurements, due to its low impact energy. Acoustic emission (AE) measurement is an alternative, as it can detect crack initiation or rubbing between moving surfaces. However, AE measurement requires a high sampling frequency, and consequently a huge amount of data must be processed. It also requires expensive hardware to capture and store those data, and signal processing techniques to retrieve valuable information on the state of the machine. AE signals have been utilised for early detection of defects in bearings and gears. This paper presents an online condition monitoring (CM) system for slow-speed machinery which attempts to overcome those challenges. The system incorporates signal processing techniques relevant to slow-speed CM, including noise removal to enhance the signal-to-noise ratio and peak-holding down-sampling to reduce the burden of massive data handling. The analysis software runs in the LabVIEW environment, which enables online remote control of data acquisition, real-time analysis, offline analysis and diagnostic trending. The system has been fully implemented on a site machine and is contributing significantly to improved maintenance efficiency and safer, more reliable operation.
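The peak-holding down-sampling mentioned above can be sketched in a few lines; this is a minimal illustration of the general technique (keep the largest-magnitude sample in each block), not the authors' LabVIEW implementation:

```python
def peak_hold_downsample(signal, factor):
    """Down-sample by keeping the largest-magnitude sample in each
    block of `factor` samples, preserving burst amplitudes that plain
    decimation would discard."""
    out = []
    for start in range(0, len(signal) - factor + 1, factor):
        block = signal[start:start + factor]
        out.append(max(block, key=abs))  # retain the peak, with its sign
    return out
```

Unlike plain decimation, this retains the short high-amplitude AE bursts that signal incipient defects, at the cost of distorting the waveform between peaks.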
Abstract:
Cloud computing has emerged as a major ICT trend and has been acknowledged as a key industry theme by prominent ICT organisations. However, one of the major challenges facing cloud computing and its global acceptance is how to secure and protect the data that is the property of the user. The geographic location of cloud data storage centres is an important issue for many organisations and individuals, owing to the regulations and laws that require data and operations to reside in specific geographic locations. Thus, data owners may need to ensure that their cloud providers do not breach the SLA contract by moving their data to another geographic location. This paper introduces an architecture for a new approach to geographic location assurance, which combines a proof of storage (POS) protocol with a distance-bounding protocol. This allows the client to check where their stored data is located without relying on the word of the cloud provider. The architecture aims to achieve better security and more flexible geographic assurance within the cloud computing environment.
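The distance-bounding half of such a scheme rests on a simple physical bound: a challenge answered in round-trip time t cannot have come from farther than c·t/2 away. A toy sketch of that check follows; the function names and the processing-delay parameter are illustrative assumptions, not part of the paper's architecture:

```python
SPEED_OF_LIGHT_KM_PER_MS = 299.792  # ~300 km per millisecond in vacuum

def max_distance_km(rtt_ms, processing_ms=0.0):
    """Upper-bound the responder's distance from the round-trip time:
    the signal travelled at most c * (rtt - processing) / 2 one way."""
    travel_ms = max(rtt_ms - processing_ms, 0.0)
    return SPEED_OF_LIGHT_KM_PER_MS * travel_ms / 2

def within_region(rtt_ms, region_radius_km, processing_ms=0.0):
    """The challenged storage server can only be inside the claimed
    region if the physical distance bound falls within its radius."""
    return max_distance_km(rtt_ms, processing_ms) <= region_radius_km
```

In a real protocol the challenge must also carry a fresh proof that the responding server actually holds the data, so the provider cannot answer from a nearby relay while storing the data elsewhere.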
Abstract:
In most visual mapping applications suited to Autonomous Underwater Vehicles (AUVs), stereo visual odometry (VO) is rarely utilised as a pose estimator, as imagery is typically captured at a very low frame rate due to energy conservation and data storage requirements. This adversely affects the robustness of a vision-based pose estimator and its ability to generate a smooth trajectory. This paper presents a novel VO pipeline for low-overlap imagery from an AUV that utilises constrained motion and integrates magnetometer data in a bi-objective bundle adjustment stage to achieve low-drift pose estimates over large trajectories. We analyse the performance of a standard stereo VO algorithm and compare the results to the modified VO algorithm. Results are demonstrated in a virtual environment in addition to low-overlap imagery gathered from an AUV. The modified VO algorithm shows significantly improved pose accuracy and performance over trajectories of more than 300 m. In addition, dense 3D meshes generated from the visual odometry pipeline are presented as a qualitative output of the solution.
Abstract:
In an increasingly business technology (BT) dependent world, the impact of the extraordinary changes brought about by the nexus of mobile and cloud technologies, social media and big data is increasingly being felt in the board room. As leaders of enterprises of every type and size, board directors can no longer afford to ignore, delegate or avoid BT-related decisions. Competitive, financial and reputational risk is increased if boards fail to recognize their role in governing technology as an asset and in removing barriers to improving enterprise business technology governance (EBTG). Directors’ awareness of the need for EBTG is increasing. However, industry research shows that board-level willingness to rectify the gap between awareness and action is very low or non-existent. This literature review-based research identifies barriers to EBTG effectiveness. It provides a practical starting point for board analysis. We offer four outcomes that boards might focus on to ensure the organizations they govern are not left behind by those led by the upcoming new breed of technology-savvy leaders. Most extant research looks backward for examples, examining data from before 2010, the time when a tipping point in the personal and business use of multimedia and mobile-internet devices significantly deepened the impacts of the identified nexus technology forces and began rapidly changing the way many businesses engage with their customers, employees and stakeholders. We situate our work amidst these nexus forces, discuss the board’s role in EBTG in this context, and modernize current definitions of enterprise technology governance. The primary limitation faced is the lack of scholarly research relating to EBTG in the rapidly changing digital economy.
Although we have drawn on recent (2011–2013) industry surveys, the volume of these surveys and the congruence across them are significant in terms of levels of increased awareness and calls for increased board attention and competency in EBTG and strategic information use. Where possible we have used scholarly research to illustrate or discuss industry findings.
Abstract:
Now as in earlier periods of acute change in the media environment, new disciplinary articulations are producing new methods for media and communication research. At the same time, established media and communication studies methods are being recombined, reconfigured, and remediated alongside their objects of study. This special issue of JOBEM seeks to explore the conceptual, political, and practical aspects of emerging methods for digital media research. It does so at the conjuncture of a number of important contemporary trends: the rise of a ‘‘third wave’’ of the Digital Humanities and the ‘‘computational turn’’ (Berry, 2011) associated with natively digital objects and the methods for studying them; the apparently ubiquitous Big Data paradigm—with its various manifestations across academia, business, and government—that brings with it a rapidly increasing interest in social media communication and online ‘‘behavior’’ from the ‘‘hard’’ sciences; along with the multisited, embodied, and emplaced nature of everyday digital media practice.
Abstract:
Although popular media narratives about the role of social media in driving the events of the 2011 “Arab Spring” are likely to overstate the impact of Facebook and Twitter on these uprisings, it is nonetheless true that protests and unrest in countries from Tunisia to Syria generated a substantial amount of social media activity. On Twitter alone, several million tweets containing the hashtags #libya or #egypt were generated during 2011, both by directly affected citizens of these countries and by onlookers from further afield. What remains unclear, though, is the extent to which there was any direct interaction between these two groups (especially considering potential language barriers between them). Building on hashtag data sets gathered between January and November 2011, this article compares patterns of Twitter usage during the popular revolution in Egypt and the civil war in Libya. Using custom-made tools for processing “big data,” we examine the volume of tweets sent by English-, Arabic-, and mixed-language Twitter users over time and examine the networks of interaction (variously through @replying, retweeting, or both) between these groups as they developed and shifted over the course of these uprisings. Examining @reply and retweet traffic, we identify general patterns of information flow between the English- and Arabic-speaking sides of the Twittersphere and highlight the roles played by users bridging both language spheres.
Abstract:
MapReduce is a computation model for processing large data sets in parallel on large clusters of machines, in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mappers/reducers placement problem is a generalisation of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it against several other heuristics on solution quality and computation time by solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional placement which puts a fixed number of mappers/reducers on each machine. The comparison results show that the computation using our placement is much cheaper than the computation using the conventional placement, while still satisfying the computation deadline.
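Since the abstract frames placement as a generalisation of bin packing, the flavour of such heuristics can be shown with the classic first-fit-decreasing rule. This is a textbook sketch under a single uniform capacity per machine, not the authors' algorithm:

```python
def first_fit_decreasing(task_sizes, capacity):
    """First-fit-decreasing heuristic for bin packing: place each task
    (largest first) on the first machine with enough free capacity,
    opening a new machine when none fits."""
    free = []        # remaining capacity of each machine
    assignment = []  # task sizes assigned to each machine
    for size in sorted(task_sizes, reverse=True):
        for i, remaining in enumerate(free):
            if size <= remaining:
                free[i] -= size
                assignment[i].append(size)
                break
        else:  # no open machine fits: provision a new one
            free.append(capacity - size)
            assignment.append([size])
    return assignment
```

FFD is only a baseline: sorting first tends to keep the machine count low, but for the full mappers/reducers placement problem, monetary cost and the computation deadline would also have to enter the fitness test.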
Abstract:
The syntheses, properties and electronic structures of a series of porphyrin dimers connected by two-atom bridges were compared. The study found that an azo linker results in the most efficient electronic communication between the two porphyrin rings, and is the superior connector for dimers, trimers and oligomers in the design of nonlinear optical materials. This has implications for the design of molecular probes and sensors, photodynamic therapy, microfabrication, and three-dimensional optical data storage. The research led to the synthesis of a number of new porphyrin monomers and dimers, which were characterised using structural, spectroscopic and spectrometric techniques.
Abstract:
Social Media Analytics is a new field of research in which interdisciplinary methods are combined, extended and adapted in order to analyse social media data. Beyond answering research questions, a further goal is to provide architectural designs for the development of new information systems and applications based on social media. This article presents the most important aspects of the field of Social Media Analytics and points to the need for a cross-disciplinary research agenda, in whose creation and pursuit Information Systems research (Wirtschaftsinformatik) has an important role to play.
Abstract:
Social Media Analytics is an emerging interdisciplinary research field that aims to combine, extend, and adapt methods for the analysis of social media data. On the one hand it can support IS and other research disciplines in answering their research questions; on the other hand it helps to provide architectural designs as well as solution frameworks for new social media-based applications and information systems. The authors suggest that IS should contribute to this field and help to develop and pursue an interdisciplinary research agenda.