859 results for big-box retailing
Abstract:
This paper considers the role of CCTV (closed-circuit television) in the surveillance, policing and control of public space in urban and rural locations, specifically in relation to the use of public space by young people. The use of CCTV technology in public spaces is now an established and largely uncontested feature of everyday life in a number of countries, and the assertion that cameras are essentially there for the protection of law-abiding and consuming citizens has broadly gone unchallenged. With little or no debate in the U.K. to critique the claims made by the burgeoning security industry that CCTV protects people in the form of a ‘Big Friend’, the state at both central and local levels has endorsed the installation of CCTV apparatus across the nation. Some areas assert in their promotional material that the centre of the shopping and leisure zone is fully surveilled by cameras, in order to reassure visitors that their personal safety is a matter of civic concern; even small towns and villages expend monies on sophisticated and expensive-to-maintain camera systems. It is within this context of monitoring, recording and control procedures that young people’s use of public space is constructed as a threat to social order, in need of surveillance and exclusion, a construction that now forms a major feature in shaping thinking about urban and rural working-class young people in the U.K. As Loader (1996) notes, young people’s claims on public space rarely gain legitimacy if they ‘collide’ with those of local residents; Davis (1990) describes the increasing ‘militarization and destruction of public space’; and Jacobs (1965) asserts that full participation in the ‘daily life of urban streets’ is essential to the development of young people and beneficial for all who live in an area.
This paper challenges the uncritical acceptance of the widespread use of CCTV and identifies its oppressive and malevolent potential in forming a ‘surveillance gaze’ over young people (adapting Foucault’s ‘clinical gaze’, c. 1973), which can jeopardise mental health and well-being in coping with the ‘metropolis’, after Simmel (1964).
Abstract:
In this submission, we provide evidence for our view that copyright policy in the UK must encourage new digital business models which meet the changing needs of consumers and foster innovation in the UK both within, and beyond, the creative industries. We illustrate our arguments using evidence from the music industry. However, we believe that our key points on the relationship between the copyright system and innovative digital business models apply across the UK creative industries.
Abstract:
Big data is big news in almost every sector, including crisis communication. However, not everyone has access to big data, and even those who do often lack the tools needed to analyze and cross-reference such large data sets. This paper therefore looks at patterns in the small data sets we are able to collect with current tools, to see whether actionable information can be found in what we already have. We analyzed 164,390 tweets collected during the 2011 earthquake to find out what types of location-specific information people mention in their tweets and when they mention it. Based on our analysis, we find that even a small data set, holding far less data than a big data set, can be useful for quickly identifying priority disaster-specific areas.
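The abstract does not detail the extraction method, but the basic idea of spotting location-specific mentions in tweets can be illustrated with a minimal sketch that matches tweet words against a small gazetteer. The place names below are entirely hypothetical, not from the study.

```python
import re

# Hypothetical gazetteer of local place names; a real system would use
# an authoritative list for the affected region.
GAZETTEER = {"oakvale", "millbrook", "eastport"}

def location_mentions(tweet: str) -> list[str]:
    """Return gazetteer place names mentioned in a tweet (case-insensitive)."""
    words = re.findall(r"[a-z]+", tweet.lower())
    return [w for w in words if w in GAZETTEER]

hits = location_mentions("Bridge down near Millbrook, road to Eastport closed")
```

Counting such hits per place over time would give the kind of priority-area signal the paper describes, even from a modest data set.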
Abstract:
More than ever before, contemporary societies are characterised by the huge amounts of data being transferred. Authorities, companies, academia and other stakeholders refer to Big Data when discussing the importance of large and complex datasets and developing possible solutions for their use. Big Data promises to be the next frontier of innovation for institutions and individuals, yet it also offers possibilities to predict and influence human behaviour with ever-greater precision.
Abstract:
In most developing countries, the overall quality of the livelihood of labourers, the workplace environment and the implementation of labour rights do not progress at the same rate as industrial development. To address this situation, the ILO has initiated the concept of 'decent work' to assist regulators in articulating labour-related social policy goals. Against this backdrop, this article assesses the Bangladesh Labour Law 2006 by reference to the four social principles developed by the ILO for ensuring 'decent work'. It explains the impact of the absence of these principles in this Law on labour administration in the ready-made garment and ship-breaking industries. It finds that an appropriate legislative framework needs to be based on the principles of 'decent work' to establish a solid platform for sound labour regulation in Bangladesh.
Abstract:
Big Data is a rising IT trend, similar to cloud computing, social networking or ubiquitous computing, and it can offer beneficial scenarios in the e-health arena. In one such scenario, Big Data must be kept secure over a long period of time in order to realise benefits such as finding cures for infectious diseases while protecting patient privacy. It is therefore desirable to analyse Big Data and extract meaningful information while the data is stored securely, which makes the analysis of database encryption techniques essential. In this study, we simulated three technical environments, namely Plain-text, Microsoft Built-in Encryption, and custom Advanced Encryption Standard (AES), using a Bucket Index in Data-as-a-Service (DaaS). The results showed that custom AES-DaaS has a faster range-query response time than MS built-in encryption. Furthermore, the scalability test showed that there are performance thresholds which depend on physical IT resources. For efficient Big Data management in e-health it is therefore important to examine these scalability limits, even in a cloud computing environment. In addition, when designing an e-health database, both patient privacy and system performance need to be treated as top priorities.
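The bucket-index idea behind the range-query comparison can be sketched as follows. To keep the example dependency-free, the XOR "cipher" is only a stand-in for AES, and the bucket width of 10 is an arbitrary illustrative choice; neither is taken from the study.

```python
KEY = 0x5A  # toy key; a stand-in for a real AES key

def encrypt(v: int) -> int:   # stand-in for AES encryption
    return v ^ KEY

def decrypt(c: int) -> int:   # stand-in for AES decryption
    return c ^ KEY

def bucket(v: int, width: int = 10) -> int:
    """Coarse bucket id computed on the plaintext before encryption."""
    return v // width

# Server-side store holds only (bucket_id, ciphertext); plaintext never stored.
store = [(bucket(v), encrypt(v)) for v in [3, 17, 25, 42, 58]]

def range_query(lo: int, hi: int, width: int = 10) -> list[int]:
    """Fetch all candidate buckets, then decrypt and filter client-side."""
    wanted = set(range(bucket(lo, width), bucket(hi, width) + 1))
    candidates = [decrypt(c) for b, c in store if b in wanted]
    return sorted(v for v in candidates if lo <= v <= hi)

result = range_query(15, 45)
```

The coarse index lets the server answer range queries without seeing plaintext, at the cost of returning false positives that the client filters after decryption, which is the trade-off the response-time measurements probe.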
Abstract:
Objective: To describe the characteristics of patients presenting to Emergency Departments (EDs) within Queensland, Australia with injuries due to assault with a glass implement (‘glassing’), and to set this within the broader context of presentations due to alcohol-related violence. Methods: Analysis of prospectively collected ED injury surveillance data collated by the Queensland Injury Surveillance Unit (QISU) between 1999 and 2011. Cases of injury due to alcohol-related violence were identified and analysed using coded fields supplemented with qualitative data contained within the injury description text. Descriptive statistics were used to assess the characteristics of injury presentations due to alcohol-related violence. Violence included interpersonal violence and aggression (verbal aggression and object violence). Results: A total of 4629 cases were studied. The study population was predominantly male (72%) and aged 18 to 24 (36%), with males in this age group comprising more than a quarter of the study population (28%). Nine per cent of alcohol-related assault injuries were a consequence of ‘glassing’. The home was the most common location for alcohol-related violence (31%) and alcohol-related ‘glassings’ (33%). Overall, the most common glass object involved was a bottle (75%); within licensed venues, however, an even mix of drinking glasses (44%) and glass bottles (45%) was identified. Conclusions: Contrary to public perception generated by the media, ‘glassing’ incidents, particularly at licensed venues, constitute a relatively small proportion of all alcohol-related violence. The current study highlights the predominance of young men injured following alcohol-related violence, identifying a key population focus for prevention strategies.
Abstract:
The T-box family transcription factor gene TBX20 acts in a conserved regulatory network, guiding heart formation and patterning in diverse species. Mouse Tbx20 is expressed in cardiac progenitor cells, differentiating cardiomyocytes, and developing valvular tissue, and its deletion or RNA interference-mediated knockdown is catastrophic for heart development. TBX20 interacts physically, functionally, and genetically with other cardiac transcription factors, including NKX2-5, GATA4, and TBX5, mutations of which cause congenital heart disease (CHD). Here, we report nonsense (Q195X) and missense (I152M) germline mutations within the T-box DNA-binding domain of human TBX20 that were associated with a family history of CHD and a complex spectrum of developmental anomalies, including defects in septation, chamber growth, and valvulogenesis. Biophysical characterization of wild-type and mutant proteins indicated how the missense mutation disrupts the structure and function of the TBX20 T-box. Dilated cardiomyopathy was a feature of the TBX20 mutant phenotype in humans and mice, suggesting that mutations in developmental transcription factors can provide a sensitized template for adult-onset heart disease. Our findings are the first to link TBX20 mutations to human pathology. They provide insights into how mutation of different genes in an interactive regulatory circuit leads to diverse clinical phenotypes, with implications for diagnosis, genetic screening, and patient follow-up.
Abstract:
This paper describes a study of the theoretical and experimental behaviour of box-columns of varying b/t ratios under loadings of axial compression and torsion, and their combinations. Details of the testing rigs and testing methods, the results obtained (such as the load-deflection curves and the interaction diagrams), and experimental observations regarding the behaviour of the box-models and the types of local plastic mechanisms associated with each type of loading are presented. A simplified rigid-plastic analysis, based on the observed plastic mechanisms, is carried out to study the collapse behaviour of box-columns under these loadings, and the results are compared with those of the experiments.
Abstract:
The promise of ‘big data’ has generated a great deal of interest in the development of new approaches to research in the humanities and social sciences, as well as a range of important critical interventions which warn of an unquestioned rush to ‘big data’. Drawing on the experience of developing innovative ‘big data’ approaches to social media research, this paper examines some of the repercussions for the scholarly research and publication practices of those researchers who do pursue the path of ‘big data’–centric investigation in their work. As researchers import the tools and methods of highly quantitative, statistical analysis from the ‘hard’ sciences into computational, digital humanities research, must they also subscribe to the language and assumptions underlying such ‘scientificity’? If so, how does this affect the choices made in gathering, processing, analysing, and disseminating the outcomes of digital humanities research? In particular, is there a need to rethink the forms and formats of publishing scholarly work in order to enable the rigorous scrutiny and replicability of research outcomes?
Abstract:
This thesis examined the possible role of Y-box binding protein 1 (YBX1) in prostate cancer aggressiveness and spread. Novel roles were uncovered for YBX1 in the regulation of several genes previously implicated in prostate cancer; YBX1 was also shown to increase tumour cell invasion and motility and to reciprocally regulate androgen-regulated gene networks. In addition, YBX1 was found to regulate several other well-known cancer genes implicated in breast and other cancers. The work performed in this thesis has strengthened the foundations for pursuing YBX1 as a possible central target molecule in prostate cancer therapeutics.
Abstract:
Acoustic sensing is a promising approach to scaling faunal biodiversity monitoring, but scaling the analysis of the audio collected by acoustic sensors is a big data problem. Standard approaches for dealing with big acoustic data include automated recognition and crowd-based analysis. Automatic methods are fast at processing but hard to design rigorously, whilst manual methods are accurate but slow. In particular, manual methods of acoustic data analysis are constrained by a 1:1 time relationship between the data and its analysts: the inherent need to listen to the audio. This paper demonstrates how the efficiency of crowd-sourced sound analysis can be increased by an order of magnitude through visual inspection of audio rendered as spectrograms. Experimental data suggest that an analysis speedup of 12× is obtainable for suitable types of acoustic analysis, given that only spectrograms are shown.
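The spectrograms shown to analysts are, in essence, magnitude short-time Fourier transforms of the audio. A minimal, dependency-free sketch follows; the naive DFT, window size and hop size are illustrative choices, not the paper's processing chain.

```python
import math

def spectrogram(signal, win=64, hop=32):
    """Naive magnitude STFT: slide a window over the signal and take a
    DFT of each frame; returns a time x frequency list of magnitudes."""
    frames = []
    for start in range(0, len(signal) - win + 1, hop):
        frame = signal[start:start + win]
        mags = []
        for k in range(win // 2):  # keep only non-negative frequencies
            re = sum(frame[n] * math.cos(2 * math.pi * k * n / win) for n in range(win))
            im = -sum(frame[n] * math.sin(2 * math.pi * k * n / win) for n in range(win))
            mags.append(math.hypot(re, im))
        frames.append(mags)
    return frames

# A pure tone should concentrate its energy in a single frequency bin:
# with sr = 8000 and win = 64, a 1000 Hz tone lands in bin 1000*64/8000 = 8.
sr = 8000
tone = [math.sin(2 * math.pi * 1000 * t / sr) for t in range(512)]
spec = spectrogram(tone)
peak_bin = max(range(len(spec[0])), key=lambda k: spec[0][k])
```

Rendering such frames as an image is what lets an analyst scan minutes of audio in seconds, which is the source of the reported speedup.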
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information, whether that is information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or information which gives an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing over time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify users who either serve as amplifiers of information or are known as authoritative sources.
Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in this sense, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
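The content-analysis scoring described for the first paper can be sketched under the assumption of simple keyword weighting; the keyword lists, weights, and threshold below are illustrative, not taken from the panel.

```python
# Assumed topical and urgency vocabularies with hand-picked weights.
TOPIC = {"flood": 2, "earthquake": 2, "fire": 2}
URGENT = {"trapped": 3, "injured": 3, "help": 2, "now": 1}

def score(tweet: str) -> int:
    """Sum topical and urgency weights over the tweet's words."""
    words = tweet.lower().split()
    return sum(TOPIC.get(w, 0) + URGENT.get(w, 0) for w in words)

def triage(tweets: list[str], threshold: int = 4) -> list[str]:
    """Keep tweets scoring at or above the threshold, highest score first,
    so the queue matches the expected capacity of responders."""
    keep = [t for t in tweets if score(t) >= threshold]
    return sorted(keep, key=score, reverse=True)

queue = triage([
    "nice weather today",
    "earthquake here, people trapped help",
    "earthquake felt downtown",
])
```

Raising or lowering the threshold is the knob that matches the filtered stream to responder capacity, as the panel abstract describes.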
Abstract:
Modern health information systems can generate several exabytes of patient data, the so-called "Health Big Data", per year. Many health managers and experts believe that with these data it is possible to easily discover useful knowledge to improve health policies, increase patient safety and eliminate redundancies and unnecessary costs. The objective of this paper is to discuss the characteristics of Health Big Data as well as the challenges and solutions for health Big Data Analytics (BDA), the process of extracting knowledge from sets of Health Big Data, and to design and evaluate a pipelined framework for use as a guideline/reference in health BDA.
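A pipelined analytics framework of the kind the paper proposes can be approximated as a composition of stages; the stage functions below (ingest, clean, analyse) are illustrative placeholders, not the paper's actual pipeline.

```python
from functools import reduce

def ingest(records):
    """Normalise raw records (here: strip whitespace)."""
    return [r.strip() for r in records]

def clean(records):
    """Drop empty records; a real stage would also validate and de-duplicate."""
    return [r for r in records if r]

def analyse(records):
    """Placeholder aggregate standing in for the knowledge-extraction step."""
    return {"count": len(records)}

def pipeline(*stages):
    """Compose stages left to right into a single callable."""
    return lambda data: reduce(lambda acc, stage: stage(acc), stages, data)

bda = pipeline(ingest, clean, analyse)
report = bda([" bp 120/80 ", "", "hr 72 "])
```

Keeping each stage a pure function makes stages independently testable and replaceable, which is the main appeal of a pipelined design as a guideline/reference.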
Abstract:
Environmental monitoring is becoming critical as human activity and climate change place greater pressures on biodiversity, leading to an increasing need for data to make informed decisions. Acoustic sensors can help collect data across large areas for extended periods, making them attractive for environmental monitoring. However, managing and analysing large volumes of environmental acoustic data is a great challenge, and it is consequently hindering effective utilization of the big datasets collected. This paper presents an overview of our current techniques for collecting, storing and analysing large volumes of acoustic data efficiently, accurately, and cost-effectively.