932 results for Center Sets


Relevance:

20.00%

Publisher:

Abstract:

The rise of creative industries requires new thinking in communication, media and cultural studies, media and cultural policy, and the arts and information sectors. The Creative Industries, Culture and Policy sets the agenda for these debates, providing a richer understanding of the dynamics of cultural markets, creative labor, finance and risk, and how culture is distributed, marketed and creatively reused through new media technologies. This book: develops a global perspective on the creative industries and creative economy; draws insights from media and cultural studies, innovation economics, cultural policy studies, and economic and cultural geography; explores what it means for policy-makers when culture and creativity move from the margins to the center of economic dynamics; and makes extensive use of case studies in ways that are relevant not only to researchers and policy-makers, but also to the generation of students who will increasingly be establishing a ‘portfolio career’ in the creative industries. International in coverage, The Creative Industries traces the historical and contemporary ideas that make the cultural economy more relevant than it has ever been. It is essential reading for students and academics in media, communication and cultural studies. Table of Contents - Introduction - Origins of Creative Industries Policy - International Models of Creative Industries Policy - From Culture Industries to Cultural Economy - Products, Services, Production and Creative Work - Consumption, Markets, Technology and Cultural Trade - Globalization, Cities and Creative Spaces - Creative Industries and Public Policy - Conclusion

Relevance:

20.00%

Publisher:

Abstract:

The CCI-Creative City Index was commissioned in 2010 by the Beijing Academy of Science & Technology's Beijing Research Center for the Science of Science. John Hartley was asked to develop a new creative global city index. The brief was to improve on the existing indexes with a specific focus on creative industries and the sources of creative development. This report, by John Hartley, Jason Potts, Trent MacDonald, with Chris Erkunt and Carl Kufleitner, sets out the new model we have developed, which we call the CCI Creative City Index (CCI-CCI). It presents the results of a pilot application of the index to six cities: London, Cardiff, Berlin, Bremen, Melbourne and Brisbane. The index incorporates many elements from other global and creative city indexes, but also adds several new dimensions relating to creative industries scope, micro-productivity, and the economy of attention. The report and Excel spreadsheets of index calculations can be found on this site.

Relevance:

20.00%

Publisher:

Abstract:

The rapid increase in the deployment of CCTV systems has led to a greater demand for algorithms that are able to process incoming video feeds. These algorithms are designed to extract information of interest for human operators. Over the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task in which the system is trained on normal data and is required to detect events that do not fit the learned 'normal' model. Many researchers have tried various sets of features to train different learning models to detect abnormal behaviour in video footage. In this work we propose using a Semi-2D Hidden Markov Model (HMM) to model the normal activities of people. Outliers of the model, i.e. observations with insufficient likelihood, are identified as abnormal activities. Our Semi-2D HMM is designed to model both the temporal and spatial causalities of crowd behaviour by assuming that the current state of the Hidden Markov Model depends not only on the previous state in the temporal direction, but also on the previous states of the adjacent spatial locations. Two different HMMs are trained to model the vertical and horizontal spatial causal information, respectively. Location features, flow features and optical flow textures are used as the features for the model. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
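
For illustration, the core novelty-detection idea (train only on normal data, flag low-likelihood observations) can be sketched with a standard one-dimensional HMM. The following is a minimal sketch assuming the hmmlearn library and synthetic feature vectors; it does not reproduce the paper's Semi-2D spatial coupling, feature extraction or UCSD evaluation protocol.

```python
# Minimal 1D analogue of HMM-based novelty detection (synthetic data).
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

# Hypothetical training data: feature vectors from "normal" video clips.
normal_clips = [rng.normal(loc=0.0, scale=1.0, size=(50, 4)) for _ in range(20)]
X_train = np.vstack(normal_clips)
lengths = [len(c) for c in normal_clips]

# Fit an HMM on normal behaviour only (novelty-detection setup).
model = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                        n_iter=50, random_state=0)
model.fit(X_train, lengths)

# Threshold: a low percentile of per-frame log-likelihood on the training clips.
train_scores = [model.score(c) / len(c) for c in normal_clips]
threshold = np.percentile(train_scores, 5)

def is_abnormal(clip: np.ndarray) -> bool:
    """Flag a clip whose average log-likelihood falls below the learned threshold."""
    return model.score(clip) / len(clip) < threshold

# A clip with shifted statistics should score poorly under the "normal" model.
abnormal_clip = rng.normal(loc=4.0, scale=1.0, size=(50, 4))
print(is_abnormal(abnormal_clip))
```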

Relevance:

20.00%

Publisher:

Abstract:

A review of 291 catalogued particles on the basis of particle size, shape, bulk chemistry, and texture is used to establish a reliable taxonomy. Extraterrestrial materials occur in three defined categories: spheres, aggregates and fragments. Approximately 76% of aggregates are of probable extraterrestrial origin, whereas spheres contain the smallest amount of extraterrestrial material (approx 43%). -B.M.

Relevance:

20.00%

Publisher:

Abstract:

Big data is big news in almost every sector, including crisis communication. However, not everyone has access to big data, and even those who do often lack the tools needed to analyze and cross-reference such large data sets. This paper therefore looks at patterns in the small data sets we are able to collect with our current tools, to understand whether we can find actionable information in what we already have. We analyzed 164,390 tweets collected during the 2011 earthquake to find out what type of location-specific information people mention in their tweets and when they mention it. Based on our analysis, we find that even a small data set, with far less data than a big data set, can be useful for quickly identifying priority disaster-specific areas.
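
As an illustration of the kind of small-data analysis described, the sketch below counts tweets mentioning location-related keywords per hour; the sample rows, column names and keyword list are hypothetical and are not taken from the paper's data set.

```python
# Count tweets mentioning location-related keywords, bucketed by hour.
import pandas as pd

# Hypothetical sample data standing in for a collected tweet set.
tweets = pd.DataFrame({
    "created_at": pd.to_datetime([
        "2011-01-01 12:55", "2011-01-01 12:58",
        "2011-01-01 13:10", "2011-01-01 14:02",
    ]),
    "text": [
        "Bridge on the main road has collapsed",
        "Anyone know if the hospital is open?",
        "We are all safe here",
        "School on our street badly damaged",
    ],
})

# Hypothetical location-related keywords; the paper's own categories differ.
location_keywords = ["road", "bridge", "school", "hospital", "suburb", "street"]
pattern = "|".join(location_keywords)

# Keep only tweets that mention at least one location keyword.
located = tweets[tweets["text"].str.contains(pattern, case=False, na=False)]

# Count location mentions per hour to see when people talk about them.
hourly_counts = located.set_index("created_at").resample("1H")["text"].count()
print(hourly_counts)
```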

Relevance:

20.00%

Publisher:

Abstract:

This paper evaluates the efficiency of a number of popular corpus-based distributional models in performing literature-based discovery on very large document sets, including online collections. Literature-based discovery is the process of identifying previously unknown connections from text, often published literature, that could lead to the development of new techniques or technologies. Literature-based discovery has attracted growing research interest ever since Swanson's serendipitous discovery of the therapeutic effects of fish oil on Raynaud's disease in 1986. The successful application of distributional models in automating the identification of the indirect associations underpinning literature-based discovery has been amply demonstrated in the medical domain. However, we wish to investigate the computational complexity of distributional models for literature-based discovery on much larger document collections, as they may provide computationally tractable solutions to tasks such as predicting future disruptive innovations. In this paper we perform a computational complexity analysis of four successful corpus-based distributional models to evaluate their fit for such tasks. Our results indicate that corpus-based distributional models that store their representations in fixed dimensions provide superior efficiency on literature-based discovery tasks.
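
To illustrate why fixed-dimension representations scale well, the sketch below implements random indexing, a well-known corpus-based distributional model whose term vectors occupy a fixed number of dimensions regardless of vocabulary or corpus size. It is a generic example and not necessarily one of the four models evaluated in the paper.

```python
# Random indexing: term vectors accumulated in a fixed number of dimensions.
import numpy as np
from collections import defaultdict

DIM = 300        # fixed dimensionality of all term vectors
NONZERO = 10     # number of +/-1 entries in each random index vector
rng = np.random.default_rng(42)

def index_vector() -> np.ndarray:
    """A sparse ternary random vector serving as a document's fixed signature."""
    v = np.zeros(DIM)
    pos = rng.choice(DIM, size=NONZERO, replace=False)
    v[pos] = rng.choice([-1.0, 1.0], size=NONZERO)
    return v

def build_term_vectors(documents):
    """For each term, accumulate the index vectors of the documents it occurs in."""
    term_vectors = defaultdict(lambda: np.zeros(DIM))
    for doc in documents:
        doc_vec = index_vector()
        for term in doc.lower().split():
            term_vectors[term] += doc_vec
    return term_vectors

# Toy corpus echoing the fish-oil/Raynaud's example; real collections are far larger.
corpus = [
    "fish oil reduces blood viscosity",
    "raynaud disease involves blood viscosity",
    "fish oil studied for raynaud disease",
]
vectors = build_term_vectors(corpus)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Terms that share document contexts end up with similar fixed-size vectors.
print(cosine(vectors["oil"], vectors["raynaud"]))
```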

Relevance:

20.00%

Publisher:

Abstract:

Wide-Area Measurement Systems (WAMS) provide the opportunity to utilize remote signals from different locations for the enhancement of power system stability. This paper focuses on the use of remote measurements as supplementary signals for off-center Static Var Compensators (SVCs) to damp inter-area oscillations. A combination of the participation factor and residue methods is used to select the most effective stabilizing signal. The speed difference of two generators from separate areas is identified as the best stabilizing signal and is used as a supplementary input to the lead-lag controllers of the SVCs. Time delays of remote measurements and control signals are considered. The Wide-Area Damping Controller (WADC) is implemented in the Matlab Simulink framework and tested under different operating conditions. Simulation results reveal that the proposed WADC significantly improves the dynamic characteristics of the system.
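
As a rough illustration of such a supplementary controller, the sketch below builds a washout plus lead-lag compensator as a transfer function and inspects its frequency response around typical inter-area oscillation frequencies. The gain and time constants are hypothetical placeholders, not the tuned values from the paper, and the sketch omits the measurement time delays.

```python
# Generic washout + two-stage lead-lag supplementary damping controller.
import numpy as np
from scipy import signal

K = 10.0            # controller gain (hypothetical)
Tw = 10.0           # washout time constant [s]
T1, T2 = 0.3, 0.1   # lead-lag time constants [s]; T1 > T2 gives phase lead

# K * s*Tw / (1 + s*Tw) * ((1 + s*T1) / (1 + s*T2))^2
num = np.polymul([K * Tw, 0.0], np.polymul([T1, 1.0], [T1, 1.0]))
den = np.polymul([Tw, 1.0], np.polymul([T2, 1.0], [T2, 1.0]))
wadc = signal.TransferFunction(num, den)

# Frequency response around typical inter-area oscillation frequencies (0.1-1 Hz).
w = 2 * np.pi * np.logspace(-1, 0, 5)          # rad/s
w, mag, phase = signal.bode(wadc, w=w)
for wi, m, p in zip(w, mag, phase):
    print(f"{wi / (2 * np.pi):.2f} Hz: {m:.1f} dB, {p:.1f} deg")
```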

Relevance:

20.00%

Publisher:

Abstract:

Objective: Modern series from high-volume esophageal centers report an approximate 40% 5-year survival in patients treated with curative intent and postoperative mortality rates of less than 4%. An objective analysis of factors that underpin current benchmarks within high-volume centers has not been performed. Methods: Three time periods were studied, 1990 to 1998 (period 1), 1999 to 2003 (period 2), and 2004 to 2008 (period 3), in which 471, 254, and 342 patients, respectively, with esophageal cancer were treated with curative intent. All data were prospectively recorded, and staging, pathology, treatment, operative, and oncologic outcomes were compared. Results: Five-year disease-specific survival was 28%, 35%, and 44%, and in-hospital postoperative mortality was 6.7%, 4.4%, and 1.7% for periods 1 to 3, respectively (P < .001). Period 3, compared with periods 1 and 2, respectively, was associated with significantly (P < .001) more early tumors (17% vs 4% and 6%), higher nodal yields (median 22 vs 11 and 18), and a higher R0 rate in surgically treated patients (81% vs 73% and 75%). The use of multimodal therapy increased (P < .05) across time periods. By multivariate analysis, age, T stage, N stage, vascular invasion, R status, and time period were significantly (P < .0001) associated with outcome. Conclusions: Improved survival with localized esophageal cancer in the modern era may reflect an increase of early tumors and optimized staging. Important surgical and pathologic standards, including a higher R0 resection rate and nodal yields, and lower postoperative mortality, were also observed. Copyright © 2012 by The American Association for Thoracic Surgery.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: The use of intravascular devices is associated with a number of potential complications. Despite a number of evidence-based clinical guidelines in this area, nursing practice discrepancies persist. This study aims to examine nursing practice in a cancer care setting to identify current practice and areas for improvement relative to the best available evidence. Methods: A point prevalence survey was undertaken in a tertiary cancer care centre in Queensland, Australia. On a randomly selected day, four nurses assessed intravascular device related nursing practices and collected data using a standardized survey tool. Results: 58 inpatients (100%) were assessed. Forty-eight (83%) had a device in situ, comprising 14 Peripheral Intravenous Catheters (29.2%), 14 Peripherally Inserted Central Catheters (29.2%), 14 Hickman catheters (29.2%) and six Port-a-Caths (12.4%). Suboptimal outcomes were observed, including local site complications, incorrect or inadequate documentation, lack of flushing orders, and unclean or non-intact dressings. Conclusions: This study has highlighted a number of intravascular device related nursing practice discrepancies compared with current hospital policy. Education and other implementation strategies can be applied to improve nursing practice. Following such strategies, it will be valuable to repeat this survey on a regular basis to provide feedback to nursing staff and to guide further improvement. More research is required to provide evidence for clinical practice with regard to intravascular device related consumables, flushing technique and protocols.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present large, accurately calibrated and time-synchronized data sets, gathered outdoors in controlled and variable environmental conditions, using an unmanned ground vehicle (UGV) equipped with a wide variety of sensors. These include four 2D laser scanners, a radar scanner, a color camera and an infrared camera. The paper provides a full description of the system used for data collection and of the types of environments and conditions in which these data sets were gathered, including the presence of airborne dust, smoke and rain.

Relevance:

20.00%

Publisher:

Abstract:

Background: Prevention strategies are critical to reduce infection rates in total joint arthroplasty (TJA), but evidence-based consensus guidelines on prevention of surgical site infection (SSI) remain heterogeneous and do not necessarily represent this particular patient population. Questions/Purposes: What infection prevention measures are recommended by consensus evidence-based guidelines for prevention of periprosthetic joint infection? How do these recommendations compare with expert consensus on infection prevention strategies from orthopedic surgeons at the largest international tertiary referral centers for TJA? Patients and Methods: A review of consensus guidelines was undertaken as described by Merollini et al. Four clinical guidelines met the inclusion criteria: those of the Centers for Disease Control and Prevention, the British Orthopedic Association, the National Institute of Clinical Excellence, and the National Health and Medical Research Council (NHMRC). Twenty-eight recommendations from these guidelines were used to create an evidence-based survey of infection prevention strategies, which was administered to 28 orthopedic surgeons who are members of the International Society of Orthopedic Centers. The results from the existing consensus guidelines and from expert opinion were then compared. Results: Strategies recommended in the guidelines, such as prophylactic antibiotics, preoperative skin preparation of patients and staff, and sterile surgical attire, were considered critically or significantly important by the surveyed surgeons. Additional strategies, such as ultraclean air/laminar flow, antibiotic cement, wound irrigation, and preoperative blood glucose control, were also considered highly important by the surveyed surgeons but were not recommended, or not uniformly addressed, in existing guidelines on SSI prevention. Conclusion: Current evidence-based guidelines are incomplete, and the evidence should be updated specifically to address the needs of patients undergoing TJA.

Relevance:

20.00%

Publisher:

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new grouping genetic algorithm for the mappers/reducers placement problem in cloud computing. Compared with the original grouping genetic algorithm, ours uses an innovative coding scheme and also eliminates the inversion operator, which is an essential operator in the original algorithm. The new grouping genetic algorithm is evaluated by experiments, and the experimental results show that it is much more efficient than four popular algorithms for the problem, including the original grouping genetic algorithm.
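
As an illustration of the group-based encoding behind grouping genetic algorithms, the sketch below evolves packings of tasks onto fixed-capacity machines, minimizing the number of machines used. The encoding, crossover and mutation operators here are generic textbook-style choices, not the paper's innovative coding scheme.

```python
# Generic grouping GA for a bin-packing-style placement problem.
import random

random.seed(1)

CAPACITY = 10
TASKS = [random.randint(2, 7) for _ in range(30)]   # resource demand per task

def first_fit(task_ids, groups=None):
    """Place tasks into groups (machines) by first fit; each group is a list of task ids."""
    groups = [list(g) for g in (groups or [])]
    loads = [sum(TASKS[t] for t in g) for g in groups]
    for t in task_ids:
        for i, load in enumerate(loads):
            if load + TASKS[t] <= CAPACITY:
                groups[i].append(t)
                loads[i] += TASKS[t]
                break
        else:
            groups.append([t])
            loads.append(TASKS[t])
    return groups

def random_individual():
    ids = list(range(len(TASKS)))
    random.shuffle(ids)
    return first_fit(ids)

def fitness(groups):
    """Fewer machines is better (value to minimize)."""
    return len(groups)

def crossover(parent_a, parent_b):
    """Group crossover: inject some of B's groups into A, drop clashing groups, repair."""
    injected = random.sample(parent_b, k=max(1, len(parent_b) // 2))
    injected_tasks = {t for g in injected for t in g}
    kept = [g for g in parent_a if not (set(g) & injected_tasks)]
    child = kept + [list(g) for g in injected]
    placed = {t for g in child for t in g}
    missing = [t for t in range(len(TASKS)) if t not in placed]
    random.shuffle(missing)
    return first_fit(missing, groups=child)

def mutate(groups):
    """Eliminate one random group and reinsert its tasks by first fit."""
    groups = [list(g) for g in groups]
    victim = groups.pop(random.randrange(len(groups)))
    return first_fit(victim, groups=groups)

population = [random_individual() for _ in range(30)]
for generation in range(100):
    population.sort(key=fitness)
    parents = population[:10]
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        child = crossover(a, b)
        if random.random() < 0.3:
            child = mutate(child)
        children.append(child)
    population = parents + children

best = min(population, key=fitness)
print("machines used:", fitness(best), "lower bound:", -(-sum(TASKS) // CAPACITY))
```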

Relevance:

20.00%

Publisher:

Abstract:

A key concept for the centralized provision of Business Process Management (BPM) is the Center of Excellence (CoE). Organizations establish a CoE (also known as a BPM Support Office) as their BPM maturity increases, in order to ensure a consistent and cost-effective way of offering BPM services. The definition of the offerings of such a center and the allocation of roles and responsibilities play an important role within BPM Governance. In order to plan the role of such a BPM CoE, this chapter proposes the productization of BPM, leading to a set of fifteen distinct BPM services. A portfolio management approach is suggested to position these services. The approach allows specific normative strategies to be identified for each BPM service, such as further training or BPM communication and marketing. A public sector case study provides further insights into how this approach has been used in practice. Empirical evidence from a survey of 15 organizations confirms the coverage of this set of BPM services and shows typical profiles for such BPM Centers of Excellence.

Relevance:

20.00%

Publisher:

Abstract:

Traditional nearest points methods use all the samples in an image set to construct a single convex or affine hull model for classification. However, strong artificial features and noisy data may be generated from combinations of training samples when significant intra-class variations and/or noise occur in the image set. Existing multi-model approaches extract local models by clustering each image set individually only once, with fixed clusters used for matching against various image sets. This may not be optimal for discrimination, as undesirable environmental conditions (e.g., illumination and pose variations) may result in the two closest clusters representing different characteristics of an object (e.g., a frontal face being compared to a non-frontal face). To address this problem, we propose a novel approach that enhances nearest points based methods by integrating affine/convex hull classification with an adapted multi-model approach. We first extract multiple local convex hulls from a query image set via maximum margin clustering to diminish the artificial variations and constrain the noise in local convex hulls. We then propose adaptive reference clustering (ARC), which constrains the clustering of each gallery image set by forcing its clusters to resemble the clusters in the query image set. By applying ARC, noisy clusters in the query set can be discarded. Experiments on the Honda, MoBo and ETH-80 datasets show that the proposed method outperforms single-model approaches and other recent techniques, such as Sparse Approximated Nearest Points, the Mutual Subspace Method and Manifold Discriminant Analysis.
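
For context, the single-hull baseline that the proposed multi-hull approach builds on reduces to a small least-squares problem: finding the nearest points between the affine hulls of two image sets. The sketch below uses toy random data in place of image features and does not include the maximum margin clustering or ARC steps described above.

```python
# Distance between the affine hulls of two sample sets (columns are samples).
import numpy as np

rng = np.random.default_rng(0)

def affine_hull_distance(X, Y):
    """Nearest-point distance between the affine hulls of X (d x n) and Y (d x m)."""
    mu_x, mu_y = X.mean(axis=1, keepdims=True), Y.mean(axis=1, keepdims=True)
    Ux, Uy = X - mu_x, Y - mu_y                  # directions spanning each hull
    A = np.hstack([Ux, -Uy])                     # combined basis for both hulls
    b = (mu_y - mu_x).ravel()
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    residual = A @ coeffs - b                    # unreachable component of the offset
    return float(np.linalg.norm(residual))

# Two "image sets" drawn around the same mean versus a shifted mean.
set_a = rng.normal(0.0, 1.0, size=(50, 10))
set_b = rng.normal(0.0, 1.0, size=(50, 12))
set_c = rng.normal(5.0, 1.0, size=(50, 8))
print(affine_hull_distance(set_a, set_b))   # same distribution: smaller distance
print(affine_hull_distance(set_a, set_c))   # shifted distribution: larger distance
```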