986 results for BIG-BANG NUCLEOSYNTHESIS


Relevance: 20.00%

Publisher:

Abstract:

Big Data is a rising IT trend, similar to cloud computing, social networking, or ubiquitous computing, and it can offer beneficial scenarios in the e-health arena. However, in such scenarios Big Data often needs to be kept secure for long periods of time in order to realise benefits such as finding cures for infectious diseases and protecting patient privacy. It is therefore desirable to analyse Big Data and extract meaningful information while the data remains securely stored, which makes the analysis of database encryption techniques essential. In this study, we simulated three technical environments, namely plain text, Microsoft built-in encryption, and a custom Advanced Encryption Standard (AES) implementation, using a bucket index in a Data-as-a-Service (DaaS) setting. The results showed that the custom AES-DaaS approach has a faster range-query response time than Microsoft built-in encryption. Furthermore, the scalability tests showed that performance thresholds exist that depend on the available physical IT resources. Efficient Big Data management in e-health therefore requires examining these scalability limits, even in a cloud computing environment. In addition, when designing an e-health database, both patient privacy and system performance need to be treated as top priorities.
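The abstract does not include the implementation of the bucket-index scheme; the general idea can be illustrated with a minimal Python sketch, assuming AES-GCM from the `cryptography` package and a simple equal-width bucketing function (both are illustrative assumptions, not the study's actual code):

```python
# Illustrative sketch only: store AES-encrypted values alongside a coarse
# bucket label, then answer range queries by fetching candidate buckets and
# filtering after decryption. Not the study's implementation.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

KEY = AESGCM.generate_key(bit_length=128)
aes = AESGCM(KEY)

def bucket(value, width=10):
    """Map a numeric value to a coarse bucket id (only the bucket is revealed)."""
    return value // width

def encrypt_value(value):
    nonce = os.urandom(12)
    return nonce, aes.encrypt(nonce, str(value).encode(), None)

# "Server side" store: bucket id -> list of ciphertexts
store = {}
for v in [3, 17, 25, 42, 58, 61]:
    store.setdefault(bucket(v), []).append(encrypt_value(v))

def range_query(low, high, width=10):
    """Fetch candidate buckets, decrypt locally, then filter exactly."""
    results = []
    for b in range(bucket(low, width), bucket(high, width) + 1):
        for nonce, ct in store.get(b, []):
            v = int(aes.decrypt(nonce, ct, None).decode())
            if low <= v <= high:
                results.append(v)
    return results

print(range_query(10, 45))  # -> [17, 25, 42]
```

Narrower buckets reduce the amount of post-decryption filtering per range query, at the cost of leaking more about the distribution of the stored values.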

Relevance: 20.00%

Publisher:

Abstract:

Objective: To describe the characteristics of patients presenting to Emergency Departments (EDs) within Queensland, Australia, with injuries due to assault with a glass implement (‘glassing’), and to set this within the broader context of presentations due to alcohol-related violence. Methods: Analysis of prospectively collected ED injury surveillance data collated by the Queensland Injury Surveillance Unit (QISU) between 1999 and 2011. Cases of injury due to alcohol-related violence were identified and analysed using coded fields supplemented with qualitative data contained within the injury description text. Descriptive statistics were used to assess the characteristics of injury presentations due to alcohol-related violence. Violence included interpersonal violence and aggression (verbal aggression and object violence). Results: A total of 4629 cases were studied. The study population was predominantly male (72%) and aged 18 to 24 years (36%), with males in this age group comprising more than a quarter of the study population (28%). Nine percent of alcohol-related assault injuries were a consequence of ‘glassing’. The home was the most common location for alcohol-related violence (31%) and alcohol-related ‘glassings’ (33%). Overall, the most common glass object involved was a bottle (75%); within licensed venues, however, drinking glasses (44%) and glass bottles (45%) were involved in roughly equal measure. Conclusions: Contrary to the public perception generated by the media, ‘glassing’ incidents, particularly at licensed venues, constitute a relatively small proportion of all alcohol-related violence. The current study highlights the predominance of young men injured following alcohol-related violence, identifying a key group within the population at which prevention strategies should be aimed.

Relevance: 20.00%

Publisher:

Abstract:

The promise of ‘big data’ has generated a great deal of interest in the development of new approaches to research in the humanities and social sciences, as well as a range of important critical interventions which warn of an unquestioned rush to ‘big data’. Drawing on experience gained in developing innovative ‘big data’ approaches to social media research, this paper examines some of the repercussions for the scholarly research and publication practices of those researchers who do pursue the path of ‘big data’-centric investigation in their work. As researchers import the tools and methods of highly quantitative, statistical analysis from the ‘hard’ sciences into computational, digital humanities research, must they also subscribe to the language and assumptions underlying such ‘scientificity’? If so, how does this affect the choices made in gathering, processing, analysing, and disseminating the outcomes of digital humanities research? In particular, is there a need to rethink the forms and formats of publishing scholarly work in order to enable the rigorous scrutiny and replicability of research outcomes?

Relevance: 20.00%

Publisher:

Abstract:

Sweden’s protest against the Vietnam War was given tangible form in 1969 through the decision to give economic aid to the Government of North Vietnam. The main outcome was an integrated pulp and paper mill in Vinh Phu Province, north-west of Hanoi. Known as Bai Bang after its location, the mill became the most costly, one of the longest-lasting, and the most controversial project in the history of Swedish development cooperation. In 1996, Bai Bang reached its full production capacity. Today the mill is exclusively managed and staffed by the Vietnamese, and there are plans for future expansion. At the same time, a substantial amount of money has been spent to reach these achievements. Looking back at the cumbersome history of the project, the results run contrary to the expectations of many. To learn more about the conditions for sustainable development, Sida commissioned two studies of the Bai Bang project. Together they touch upon several important issues in development cooperation over a period of almost 30 years: the change of aid paradigms over time, the role of foreign policy in development cooperation, cultural obstacles, recipient responsibility versus donor-led development, and so on. The two studies were commissioned by Sida’s Department for Evaluation and Internal Audit, an independent department reporting directly to Sida’s Board of Directors. One study assesses the financial and economic viability of the pulp and paper mill and the broader development impact of the project in Vietnam. It was carried out by the Centre for International Economics, a private Australian economic research agency. The other study analyses the decision-making processes that created and shaped the project over a period of two decades and reflects on lessons from the project for development cooperation in general. This study was carried out by the Chr. Michelsen Institute, an independent Norwegian research institution.

Relevance: 20.00%

Publisher:

Abstract:

Acoustic sensing is a promising approach to scaling faunal biodiversity monitoring, but scaling the analysis of the audio collected by acoustic sensors is a big data problem. Standard approaches for dealing with big acoustic data include automated recognition and crowd-based analysis. Automated methods are fast at processing but hard to design rigorously, whilst manual methods are accurate but slow. In particular, manual methods of acoustic data analysis are constrained by a 1:1 time relationship between the data and its analysts, a constraint that stems from the inherent need to listen to the audio. This paper demonstrates how the efficiency of crowd-sourced sound analysis can be increased by an order of magnitude through the visual inspection of audio visualised as spectrograms. Experimental data suggest that an analysis speedup of 12× is obtainable for suitable types of acoustic analysis when only spectrograms are shown.
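As an illustration of the kind of visualisation the paper relies on, here is a minimal Python sketch that renders an audio file as a spectrogram image for visual scanning; the file name and STFT parameters are assumptions for the example, not values from the paper:

```python
# Illustrative sketch: render an audio recording as a spectrogram image so an
# analyst can scan it visually instead of listening in real time.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
import matplotlib.pyplot as plt

fs, audio = wavfile.read("sensor_recording.wav")  # hypothetical file name
if audio.ndim > 1:                                # mix down to mono if needed
    audio = audio.mean(axis=1)

f, t, sxx = spectrogram(audio, fs=fs, nperseg=1024, noverlap=512)

plt.pcolormesh(t, f, 10 * np.log10(sxx + 1e-12), shading="auto")  # dB scale
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Spectrogram for visual inspection")
plt.savefig("sensor_recording_spectrogram.png", dpi=150)
```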

Relevance: 20.00%

Publisher:

Abstract:

Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information: information for a specific study, tweets which can inform emergency services or other responders during an ongoing crisis, or data which give an advantage to those involved in prediction markets. Often such a process is iterative, with keywords and hashtags changing over time, so both collection and analytic methodologies need to be continually adapted in response to this changing information. While many of the datasets collected and analyzed are preformed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting those which need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions, content analysis and user profiling (a minimal scoring sketch follows this abstract). In the former, aspects of each tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify users who either serve as amplifiers of information or are known to be authoritative sources. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, while knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form, and emotions that has the potential to affect future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches them. The paper therefore considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy for responding to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in this sense, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme among these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
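The content-analysis and user-profiling approach of the first paper is described only in outline above; the following Python sketch shows one plausible form such scoring could take, with all keyword lists, account names, weights, and thresholds being assumptions for illustration rather than the panel's actual scheme:

```python
# Illustrative sketch: score incoming tweets by keyword relevance, urgency cues,
# and a crude authority signal, then keep only those above a threshold for
# manual review by responders. All keyword lists and weights are assumptions.
URGENCY_TERMS = {"help", "trapped", "urgent", "injured", "evacuate"}
TOPIC_TERMS = {"flood", "floodwater", "brisbane", "sandbags"}
TRUSTED_USERS = {"QldPolice", "QldFES"}  # hypothetical authoritative accounts

def score_tweet(tweet):
    """Return a relevance/urgency score for a tweet dict with 'text' and 'user'."""
    words = set(tweet["text"].lower().split())
    score = 0.0
    score += 2.0 * len(words & TOPIC_TERMS)    # topical relevance
    score += 3.0 * len(words & URGENCY_TERMS)  # urgency cues
    if tweet["user"] in TRUSTED_USERS:         # user-profiling signal
        score += 5.0
    return score

def triage(tweets, threshold=4.0):
    """Keep only tweets scoring above the threshold, highest first."""
    scored = [(score_tweet(t), t) for t in tweets]
    return [t for s, t in sorted(scored, key=lambda x: x[0], reverse=True) if s >= threshold]

tweets = [
    {"user": "resident42", "text": "trapped by floodwater near the bridge, need help"},
    {"user": "someone", "text": "nice weather today"},
]
for t in triage(tweets):
    print(t["user"], "->", t["text"])
```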

Relevance: 20.00%

Publisher:

Abstract:

Modern health information systems can generate several exabytes of patient data, the so-called "Health Big Data", per year. Many health managers and experts believe that, with these data, it is possible to easily discover useful knowledge to improve health policies, increase patient safety, and eliminate redundancies and unnecessary costs. The objective of this paper is to discuss the characteristics of Health Big Data as well as the challenges and solutions for health Big Data Analytics (BDA), the process of extracting knowledge from sets of Health Big Data, and to design and evaluate a pipelined framework for use as a guideline/reference in health BDA.
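The abstract does not describe the pipelined framework itself; purely to illustrate what a pipelined analytics workflow can look like, here is a minimal Python sketch in which the stage names, data fields, and thresholds are assumptions, not the authors' design:

```python
# Illustrative sketch: a simple staged pipeline in which each stage transforms
# the output of the previous one. Stage names and logic are assumptions only.
from typing import Callable, Iterable, List

Record = dict
Stage = Callable[[List[Record]], List[Record]]

def ingest(_: List[Record]) -> List[Record]:
    # Stand-in for reading records from a health information system.
    return [{"patient": "p1", "hr": 180}, {"patient": "p2", "hr": None}]

def clean(records: List[Record]) -> List[Record]:
    # Drop records with missing vital signs.
    return [r for r in records if r["hr"] is not None]

def analyse(records: List[Record]) -> List[Record]:
    # Flag abnormally high heart rates (threshold chosen for illustration).
    return [dict(r, alert=r["hr"] > 120) for r in records]

def run_pipeline(stages: Iterable[Stage]) -> List[Record]:
    data: List[Record] = []
    for stage in stages:
        data = stage(data)
    return data

print(run_pipeline([ingest, clean, analyse]))
# -> [{'patient': 'p1', 'hr': 180, 'alert': True}]
```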

Relevance: 20.00%

Publisher:

Abstract:

Environmental monitoring is becoming critical as human activity and climate change place greater pressures on biodiversity, leading to an increasing need for data on which to base informed decisions. Acoustic sensors can help collect data across large areas for extended periods, making them attractive for environmental monitoring. However, managing and analysing large volumes of environmental acoustic data is a great challenge and consequently hinders the effective utilization of the big datasets collected. This paper presents an overview of our current techniques for collecting, storing, and analysing large volumes of acoustic data efficiently, accurately, and cost-effectively.

Relevance: 20.00%

Publisher:

Abstract:

Enterprises, both public and private, have rapidly begun to exploit the benefits of enterprise resource planning (ERP) combined with business analytics and "open data sets", which are often outside the control of the enterprise, in order to gain further efficiencies, build new service operations, and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a "cloud computing" environment. "Garbage in, garbage out", or "GIGO", is a term dating from the 1960s, long used to describe the problems of unqualified dependency on information systems. A more pertinent variation arose somewhat later, namely "garbage in, gospel out", signifying that with large-scale information systems such as ERP, and with the use of open datasets in a cloud environment, verifying the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set "impersonation" becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need to enhance identity, reliability, authenticity, and audit services, including naming and addressing services, in this emerging environment, and analyses some currently offered technologies that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
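The paper does not prescribe a mechanism for these services; one common building block for data set authenticity, shown here only as an illustration, is checking a downloaded data set against a digest published by its provider. A minimal Python sketch, in which the file name and expected digest are hypothetical placeholders:

```python
# Illustrative sketch: verify that a downloaded open data set matches a digest
# published by its provider before depending on it. File name and digest are
# hypothetical placeholders, not values from the paper.
import hashlib

EXPECTED_SHA256 = "0" * 64  # placeholder: the provider's published digest

def sha256_of_file(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_dataset(path: str, expected: str) -> bool:
    """Return True only if the file's digest matches the published value."""
    return sha256_of_file(path) == expected

if __name__ == "__main__":
    try:
        ok = verify_dataset("open_dataset.csv", EXPECTED_SHA256)
        print("dataset authentic" if ok else "dataset cannot be verified")
    except FileNotFoundError:
        print("no such file; substitute a real dataset path and digest")
```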

Relevance: 20.00%

Publisher:

Abstract:

The Design Minds The Big Picture Toolkit was one of six K7-12 secondary school design toolkits commissioned by the State Library of Queensland (SLQ) Asia Pacific Design Library (APDL) to facilitate the Stage 1 launch, on June 29, 2012, of its Design Minds online platform (www.designminds.org.au), a partnership initiative with Queensland Government Arts Queensland and the Smithsonian Cooper-Hewitt National Design Museum. Design Minds toolkits are practical guides, underpinned by one to three of the Design Minds model phases of ‘Inquire’, ‘Ideate’ and ‘Implement’ (each supported by structured reflection), designed to enhance existing school curricula and empower students with real-life design exercises within the classroom environment. Toolkits directly identify links to NAPLAN, National Curriculum, C2C and Professional Standards benchmarks, as well as the capabilities of successful and creative 21st-century citizens that they seek to engender in students through design thinking. Inspired by the Unlimited: Designing for the Asia Pacific Generation Workshop 2010 (http://eprints.qut.edu.au/47762/), this toolkit explores, through three distinct exercises, ‘design for the other 90%’, addressing tools and approaches to diverse and changing social, cultural, technological and environmental challenges. The Design Minds The Big Picture Toolkit challenges students to be active agents for change and to think creatively and optimistically about solutions to future global issues that deliver social, economic and environmental benefits. More generally, it aims to build awareness in young people of the role of design in society and the value of design-thinking skills in generating strategies to solve basic to complex systemic challenges, as well as to inspire post-secondary pathways and idea generation for education. The toolkit encourages students and teachers to develop sketching, making, communication, presentation and collaboration skills to improve their design process, as well as to pursue further inquiry (background research) to enhance the ideation exercises. Exercise 1 focuses on the ‘Inquire’ phase, Exercise 2 on the ‘Inquire’ and ‘Ideate’ phases, and Exercise 3 on the ‘Implement’ phase. Depending on the intensity of the focus, the unit of work could be developed over a 4-5 week program (approximately 4-6 x 60-minute lessons/workshops) or as smaller workshops treated as discrete learning experiences. The toolkit is available for public download from http://designminds.org.au/the-big-picture/ on the Design Minds website.

Relevance: 20.00%

Publisher:

Abstract:

Urban space has the potential to shape people's experience and understanding of the city and of the culture of a place. In some respects, murals and allied forms of wall art occupy the intersection of street art and public art, engaging, and sometimes transforming, the urban space in which they exist and those who use it. While murals are often conceived as a more ‘permanent’ form of painted art, there has been a trend in recent years towards more deliberately transient forms of wall art such as washed-wall murals and reverse graffiti. These varying forms of public wall art are embedded within the fabric of urban space and history. This paper explores the intersection of public space, public art, and public memory in a mural project in the Irish city of Cork. Focussing on the washed-wall murals of Cork's historic Shandon district, we explore the sympathetic and synergetic relationship of this wall art with the heritage architecture of the built environment, and the murals as an expression of and for the local community, past and present. Through the Shandon Big Wash Up murals we reflect on the function of participatory public art as an explicit act of urban citizenship, one which works to support community-led re-enchantment of the city through a reconnection with its past.

Relevance: 20.00%

Publisher:

Abstract:

Gel dosimetry and plastic chemical dosimeters such as Presage™ are capable of mapping dose distributions in three dimensions very accurately. Given this capability and their near tissue equivalence, one would expect that, after several decades of development, they would be the dosimeters of choice; however, they have not achieved widespread clinical use. This presentation will include a brief description and history of developments in gels and 3D plastics for dosimetry, their limitations and advantages, and their role in the future.

Relevance: 20.00%

Publisher:

Abstract:

Big data is certainly the buzz term in executive networking circles at the moment. Heralded by management consultancies and research organisations alike as the next big thing in business efficiency, it is shooting up the Gartner hype cycle to the giddy heights of the peak of inflated expectations, before it tumbles down into the trough of disillusionment.

Relevance: 20.00%

Publisher:

Abstract:

One cannot help but be impressed by the inroads that digital oilfield technologies have made into the exploration and production (E&P) industry in the past decade. Today’s production systems can be monitored by “smart” sensors that allow engineers to observe almost any aspect of performance in real time. Our understanding of how reservoirs are behaving has improved considerably since the dawn of this revolution, and the industry has been able to move away from point answers to more holistic “big picture” integrated solutions. Indeed, the industry has already reaped the rewards of many of these kinds of investments. Many billions of dollars of value have been delivered by this heightened awareness of what is going on within our assets and the world around them (Van Den Berg et al. 2010).

Relevance: 20.00%

Publisher:

Abstract:

Metaphors are a common instrument of human cognition, activated when we seek to make sense of novel and abstract phenomena. In this article we assess some of the values and assumptions encoded in the framing of the term big data, drawing on the framework of conceptual metaphor. We first discuss the terms data and big data and the meanings historically attached to them by different usage communities, and then proceed with a discourse analysis of Internet news items about big data. We conclude by characterizing two recurrent framings of the concept: as a natural force to be controlled and as a resource to be consumed.