38 results for Cercas, Javier
Abstract:
This unique book reveals the procedural aspects of knowledge-based urban planning, development and assessment. Concentrating on major knowledge city building processes, and providing state-of-the-art experiences and perspectives, this compendium explores innovative models, approaches and lessons learnt from key case studies across the world. Many cities worldwide have undergone major transformations in the 21st century in order to brand themselves as knowledge cities. This book provides a thorough understanding of these transformations and of the key issues in building prosperous knowledge cities, focusing particularly on policy-making, the planning process and performance assessment. The contributors set out the theoretical and conceptual foundations of knowledge cities and their development approach, knowledge-based urban development, and present best-practice examples from case studies across the globe. The book will prove invaluable to the planning and development departments of national, state/regional and city governments, while academics and postgraduate and undergraduate students of regional and urban studies will also find this path-breaking book an intriguing read.
Abstract:
The research reported in this paper introduces a knowledge-based urban development assessment framework, constructed to evaluate and assist in the (re)formulation of the local and regional policy frameworks and applications necessary in knowledge city transformations. The paper also reports the findings of an application of this framework in a comparative study of Boston, Vancouver, Melbourne and Manchester. The assessment framework: demonstrates an innovative way of examining the knowledge-based development capacity of cities by scrutinising their economic, socio-cultural, enviro-urban and institutional development mechanisms and capabilities; presents some of the generic indicators used to evaluate the knowledge-based development performance of cities; reveals how a city can benchmark its development level against that of other cities; and provides insights for achieving more sustainable and knowledge-based development.
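The benchmarking idea can be illustrated with a minimal sketch in Python: hypothetical indicator scores for the four case-study cities are min-max normalised within each of the framework's four development domains and averaged into a composite. The city names come from the paper; every score and the equal-weighting choice are invented for illustration and are not the framework's actual indicators.

    # Hypothetical benchmarking sketch: min-max normalise per-domain scores,
    # then rank cities by an equally weighted composite. All scores invented.
    domains = ["economic", "socio-cultural", "enviro-urban", "institutional"]
    scores = {  # city -> raw score per domain (illustrative only)
        "Boston":     [8.2, 6.9, 5.8, 7.4],
        "Vancouver":  [6.7, 7.8, 8.5, 7.1],
        "Melbourne":  [7.1, 7.5, 7.9, 7.6],
        "Manchester": [6.3, 6.6, 6.2, 6.8],
    }
    lows  = [min(s[i] for s in scores.values()) for i in range(len(domains))]
    highs = [max(s[i] for s in scores.values()) for i in range(len(domains))]

    def composite(raw):
        # Normalise each domain score to [0, 1], then take the mean.
        return sum((r - lo) / (hi - lo)
                   for r, lo, hi in zip(raw, lows, highs)) / len(raw)

    for city in sorted(scores, key=lambda c: -composite(scores[c])):
        print(f"{city:11s} composite = {composite(scores[city]):.3f}")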
Abstract:
This study analyses organisational knowledge integration processes from a multi-level and systemic perspective, with particular reference to the case of Fujitsu. A conceptual framework for knowledge integration is suggested, focusing on team-building capability, capturing and utilising individual tacit knowledge, and communication networks for integrating the dispersed specialist knowledge required in the development of new products and services. The research highlights that the knowledge integration occurring in the innovation process is a result of knowledge exposure, its distribution and embodiment, and finally its transfer, which leads to innovation capability and competitive advantage in the firm.
Abstract:
This chapter presents a comparative survey of recent key management (key distribution, discovery, establishment and update) solutions for wireless sensor networks. We consider both distributed and hierarchical sensor network architectures in which unicast, multicast and broadcast communication take place. Probabilistic, deterministic and hybrid key management solutions are presented, and we define a set of metrics to quantify their security properties and resource usage, such as processing, storage and communication overheads. We provide a taxonomy of solutions and identify the trade-offs in these schemes, concluding that there is no one-size-fits-all solution.
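As a concrete instance of the probabilistic class of schemes such surveys cover, the Python sketch below computes the key-sharing probability in Eschenauer-Gligor random key predistribution, where every node stores a ring of k keys drawn without replacement from a pool of P keys. The pool and ring sizes are illustrative, and this particular scheme stands in for the probabilistic category rather than any single solution from the chapter.

    # Eschenauer-Gligor random key predistribution: two neighbours can talk
    # securely iff their key rings intersect. The miss probability is
    # C(P-k, k) / C(P, k), so P(shared key) = 1 - C(P-k, k) / C(P, k).
    from math import comb

    def share_probability(P: int, k: int) -> float:
        return 1 - comb(P - k, k) / comb(P, k)

    # Illustrative trade-off: larger rings raise connectivity, cost storage.
    for k in (50, 100, 150):
        print(f"pool=10000, ring={k:3d} -> p = {share_probability(10000, k):.3f}")

This makes the storage/connectivity trade-off concrete: for small rings the sharing probability grows roughly as k**2/P, so storage pays off quadratically at first.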
Abstract:
A significant amount of speech is typically required for speaker verification system development and evaluation, especially in the presence of large intersession variability. This paper introduces source and utterance duration normalized linear discriminant analysis (SUN-LDA) approaches to compensate for session variability in short-utterance i-vector speaker verification systems. Two variations of SUN-LDA are proposed in which normalization techniques are used to capture source variation from both short and full-length development i-vectors, one based upon pooling (SUN-LDA-pooled) and the other on concatenation (SUN-LDA-concat) across the duration- and source-dependent session variation. Both the SUN-LDA-pooled and SUN-LDA-concat techniques are shown to improve on traditional LDA on the NIST 08 truncated 10sec-10sec evaluation conditions, with the largest gains obtained with the SUN-LDA-concat technique: a relative improvement in EER of 8% for mismatched conditions and over 3% for matched conditions over traditional LDA approaches.
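The exact SUN-LDA estimation is not spelled out in the abstract; the following simplified Python sketch, with invented shapes and labels, conveys only the pooling idea behind the SUN-LDA-pooled variant: scatter statistics are accumulated separately over source/duration-labelled development subsets, summed, and the usual LDA generalised eigenproblem is solved.

    # Simplified pooled-LDA sketch in the spirit of SUN-LDA-pooled (not the
    # authors' exact estimation): accumulate scatter per (source, duration)
    # development subset, pool, then solve Sb v = lambda Sw v.
    import numpy as np
    from scipy.linalg import eigh

    def pooled_lda(subsets, n_components=200):
        # subsets: list of (ivectors [N x D], speaker_labels [N]) pairs,
        # one pair per source/duration condition.
        D = subsets[0][0].shape[1]
        Sw = np.zeros((D, D))   # pooled within-speaker (session) scatter
        Sb = np.zeros((D, D))   # pooled between-speaker scatter
        for X, y in subsets:
            mu = X.mean(axis=0)
            for spk in np.unique(y):
                Xs = X[y == spk]
                ms = Xs.mean(axis=0)
                Sw += (Xs - ms).T @ (Xs - ms)
                Sb += len(Xs) * np.outer(ms - mu, ms - mu)
        vals, vecs = eigh(Sb, Sw + 1e-6 * np.eye(D))   # regularised
        return vecs[:, np.argsort(vals)[::-1][:n_components]]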
Abstract:
A significant amount of speech data is required to develop a robust speaker verification system, but it is difficult to find enough development speech to match all expected conditions. In this paper we introduce a new approach to Gaussian probabilistic linear discriminant analysis (GPLDA) that estimates reliable model parameters as a linearly weighted combination, taking more input from the large volume of available telephone data and proportionally less from the limited microphone data. In comparison to a traditional pooled training approach, where the GPLDA model is trained over both telephone and microphone speech, this linear-weighted GPLDA approach is shown to provide better EER and DCF performance in microphone and mixed conditions in both the NIST 2008 and NIST 2010 evaluation corpora. Based upon these results, we believe that linear-weighted GPLDA provides a better approach than pooled GPLDA, allowing for the further improvement of GPLDA speaker verification in conditions with limited development data.
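The estimation details are likewise not given in the abstract, but the core linear-weighting idea can be sketched as below in Python; the weight alpha and the reduction of the model to a mean and covariance are illustrative simplifications, not the paper's actual GPLDA parameterisation.

    # Minimal sketch of linearly weighting model statistics estimated on a
    # large telephone set and a small microphone set. alpha is illustrative.
    import numpy as np

    def weighted_stats(tel_ivecs, mic_ivecs, alpha=0.8):
        mu  = alpha * tel_ivecs.mean(axis=0) + (1 - alpha) * mic_ivecs.mean(axis=0)
        cov = (alpha * np.cov(tel_ivecs, rowvar=False)
               + (1 - alpha) * np.cov(mic_ivecs, rowvar=False))
        return mu, cov
        # pooled baseline, for contrast:
        # np.cov(np.vstack([tel_ivecs, mic_ivecs]), rowvar=False)

Compared with pooling, where the two sources are simply stacked and the telephone data dominates by sheer volume, the explicit weight makes the microphone contribution a design choice rather than an accident of data quantity.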
Abstract:
Quantitative determination of the modification of primary sediment features by the activity of organisms (i.e., bioturbation) is essential in the geosciences. The methods proposed since the 1960s are mainly based on visual or subjective determinations. The first semiquantitative evaluations of the Bioturbation Index, Ichnofabric Index, or amount of bioturbation were attempted, in the best cases, using series of flashcards designed for different situations. More recent and effective approaches involve analytical and computational techniques such as X-rays, magnetic resonance imaging or computed tomography; these are complex and often expensive. This paper presents a compilation of different methods for digital estimation, using Adobe® Photoshop® CS6 software, that form part of the IDIAP (Ichnological Digital Analysis Images Package), an inexpensive alternative to recently proposed methods that is easy to use and especially recommended for core samples. The different methods, the "Similar Pixel Selection Method (SPSM)", the "Magic Wand Method (MWM)" and the "Color Range Selection Method (CRSM)", entail advantages and disadvantages depending on the sediment (e.g., composition, color, texture, porosity) and the ichnological features (size of traces, infilling material, burrow wall, etc.). The IDIAP provides an estimate of the amount of trace fossils produced by a particular ichnotaxon, by a whole ichnocoenosis or even for a complete ichnofabric. We recommend applying the complete IDIAP to a given case study and then selecting the most appropriate method. The IDIAP was applied to core material recovered during IODP Expedition 339, enabling us, for the first time, to arrive at a quantitative estimation of the discrete trace fossil assemblage in core samples.
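The arithmetic behind a CRSM-style estimate is simple enough to sketch outside Photoshop: count the pixels whose colour falls within a tolerance of a reference trace-fossil colour and report them as a percentage of the image. In the Python sketch below, the file name, reference colour and tolerance are placeholders, and Euclidean RGB distance is an assumption standing in for Photoshop's own Color Range selection.

    # Colour-range style estimate of bioturbation: the percentage of pixels
    # within an RGB tolerance of a reference burrow-fill colour.
    import numpy as np
    from PIL import Image

    def bioturbation_percent(path, reference_rgb, tolerance=30.0):
        img  = np.asarray(Image.open(path).convert("RGB"), dtype=float)
        dist = np.linalg.norm(img - np.array(reference_rgb, float), axis=-1)
        return 100.0 * (dist <= tolerance).mean()

    # e.g. bioturbation_percent("core_section.png", reference_rgb=(92, 74, 60))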
Abstract:
This book underlines the growing importance of knowledge for the competitiveness of cities and their regions. It examines the role of knowledge, in its economic, socio-cultural, spatial and institutional forms, in urban and regional development; identifies the preconditions for innovative use of urban and regional knowledge assets and resources; and develops new methods to evaluate the performance and potential of knowledge-based urban and regional development. In doing so, the book provides an in-depth and comprehensive understanding of both the theoretical and the practical aspects of knowledge-based development and of its implications and prospects for cities and regions.
Abstract:
A pseudonym provides anonymity by protecting the identity of a legitimate user. A user with a pseudonym can interact with an unknown entity and be confident that his/her identity remains secret even if the other entity is dishonest. In this work, we present a system that allows users to create pseudonyms from a trusted master public-secret key pair. The proposed system is based on the intractability of factoring and of finding square roots of a quadratic residue modulo a composite number, where the composite number is a product of two large primes. Our proposal differs from previously published pseudonym systems in that, in addition to the standard notion of protecting the privacy of a user, it offers colligation between seemingly independent pseudonyms. This new property, when combined with a trusted platform that stores a master secret key, is extremely beneficial to a user, as it offers a convenient way to generate a large number of pseudonyms using relatively little storage.
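The paper's construction is not reproduced in the abstract; as a toy Python illustration of how squaring modulo a composite can colligate pseudonyms, consider the sketch below. Anyone can extend the chain forward, but walking it backwards means taking square roots modulo N, which is as hard as factoring N, so only the holder of the factors can link the chain back to its seed. The parameters are deliberately small and insecure.

    # Toy sketch of colligated pseudonyms via repeated squaring mod N = p*q.
    # WARNING: toy primes only; a real system needs large primes and the
    # full construction from the paper, which is not reproduced here.
    p, q = 1000003, 1000033
    N = p * q

    def pseudonym_chain(seed, length):
        x = seed % N
        chain = []
        for _ in range(length):
            x = pow(x, 2, N)      # one-way without the factors p and q
            chain.append(x)
        return chain

    print(pseudonym_chain(123456789, 4))

The small-storage claim in the abstract corresponds to the fact that an entire chain of pseudonyms regenerates from one seed plus the master key material.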
Abstract:
Tissue Engineering is a promising emerging field that studies the intrinsic regenerative potential of the human body and uses it to restore the functionality of damaged organs or tissues incapable of self-healing due to illness or ageing. To achieve regeneration using Tissue Engineering strategies, it is first necessary to study the properties of the native tissue and determine the cause of tissue failure; second, to identify an optimum population of cells capable of restoring its functionality; and third, to design and manufacture a cellular microenvironment in which those specific cells are directed towards the desired cellular functions. The design of this artificial cellular niche is of tremendous importance, because cells sense and respond to both its biochemical and its biophysical properties. In particular, the artificial niche acts as a physical scaffold for the cells, allowing their three-dimensional spatial organization; it provides mechanical stability to the artificial construct; and it supplies the biochemical and mechanical cues that control cellular growth, migration, differentiation and synthesis of natural extracellular matrix. Over the last decades, many scientists have made great contributions to the field of Tissue Engineering. Even though this research has frequently been accompanied by vast investments over extended periods of time, too often these efforts have not been enough to translate the advances into new clinical therapies. More and more scientists in the field recognise the need for rational experimental design before carrying out complex, expensive and time-consuming in vitro and in vivo trials. This review highlights the importance of computer modeling and novel biofabrication techniques as critical players in the rational design of artificial cellular niches in Tissue Engineering.
Abstract:
The accuracy of early cost estimates is critical to the success of construction projects. Previous research has usually treated the selected tender price (the client's building cost) as a holistic dependent variable when examining early-stage estimates. Unlike other components of construction cost, the amount of contingencies is decided by clients/consultants in consideration of early project information. The cost drivers of contingency estimates are associated with uncertainty and complexity, and include project size, schedule, ground conditions, construction site access, market conditions and so on. A path analysis of 133 UK school building contracts was conducted to identify the impacts of nine major cost drivers on the determination of contingencies by different clients/cost estimators. The research finds that gross floor area (GFA), schedule and the requirement for air conditioning have statistically significant impacts on contingency determination. The mediating role of schedule between gross floor area and contingencies (GFA→Schedule→Contingencies) was confirmed with the Sobel test. The total effects of the three variables on contingency estimates were obtained taking this indirect effect into account. The squared multiple correlation (SMC) of contingencies (0.624) indicates that the three identified variables explain 62.4% of the variance in contingencies, which is comparatively satisfactory considering the heterogeneity among different estimators, unknown estimating techniques and different projects.
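The reported mediation check can be illustrated with a minimal Sobel-test computation in Python. The path coefficients and standard errors below are placeholders, not values from the study; only the formula is standard.

    # Sobel test for the indirect path GFA -> Schedule -> Contingencies:
    # z = a*b / sqrt(b^2*sa^2 + a^2*sb^2), with a the GFA->Schedule path
    # and b the Schedule->Contingencies path. Values are placeholders.
    from math import sqrt
    from scipy.stats import norm

    def sobel(a, sa, b, sb):
        z = (a * b) / sqrt(b**2 * sa**2 + a**2 * sb**2)
        return z, 2 * (1 - norm.cdf(abs(z)))   # two-tailed p-value

    z, p = sobel(a=0.42, sa=0.08, b=0.35, sb=0.10)
    print(f"indirect effect: z = {z:.2f}, p = {p:.4f}")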
Abstract:
Numeric sets can be used to store and distribute important information such as currency exchange rates and stock forecasts. It is useful to watermark such data to prove ownership in case of illegal distribution. This paper analyses the numerical-set watermarking model presented by Sion et al. in "On watermarking numeric sets", identifies its weaknesses, and proposes a novel scheme that overcomes these problems. One weakness of Sion's watermarking scheme is the requirement for a normally-distributed set, which does not hold for many numeric sets such as forecast figures. Experiments indicate that the scheme is also susceptible to subset-addition and secondary-watermarking attacks. The watermarking model we propose can be used for numeric sets with arbitrary distributions. Theoretical analysis and experimental results show that the scheme is strongly resilient against sorting, subset selection, subset addition, distortion, and secondary watermarking attacks.
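The proposed scheme itself is not detailed in the abstract. As a generic sketch of what keyed numeric-set watermarking involves (explicitly not Sion et al.'s model or the authors' construction), the Python snippet below uses an HMAC of each item's stable integer part to select carrier items and embeds a key-dependent bit as the parity of the last kept decimal digit; the key, selection fraction and precision are all illustrative.

    # Generic keyed numeric-set watermarking sketch (not the paper's scheme):
    # an HMAC over the integer part selects ~1/fraction of items as carriers,
    # and each carrier's last kept decimal digit has its parity forced to a
    # key-dependent bit, keeping distortion within 10**-digits.
    import hmac, hashlib

    def embed(values, key: bytes, fraction=4, digits=2):
        marked = []
        for v in values:
            tag = hmac.new(key, str(int(v)).encode(), hashlib.sha256).digest()
            if tag[0] % fraction == 0:
                bit = tag[1] & 1
                scaled = (round(v * 10**digits) // 2) * 2 + bit
                if int(scaled / 10**digits) == int(v):  # keep keyed part stable
                    v = scaled / 10**digits
            marked.append(v)
        return marked

    print(embed([13.345, 27.182, 31.415, 105.5], key=b"owner-secret"))

Detection recomputes the same HMACs from the untouched integer parts and checks the parities; because each item carries its evidence independently of the others, a per-item mark of this kind is naturally robust to sorting and subset selection.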