9 results for Blueprint

at CentAUR: Central Archive, University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

A Blueprint for Affective Computing: A sourcebook and manual is the very first attempt to ground affective computing within the disciplines of psychology, affective neuroscience, and philosophy. This book illustrates the contributions of each of these disciplines to the development of the ever-growing field of affective computing. In addition, it demonstrates practical examples of cross-fertilization between disciplines in order to highlight the need for integration of computer science, engineering and the affective sciences.

Relevance:

10.00%

Publisher:

Abstract:

The practical application of systemic sustainability analysis (SSA; Bell and Morse, 1999), as applied in a project instigated and managed by 'Blue Plan', one of the regional activity centres of the Mediterranean Action Plan, is set out and explained in this paper. The context in which SSA was applied and adapted to SPSA (systemic and prospective sustainability analysis), in the Mediterranean and primarily in Malta, is described. The SSA process is summarized, its extension and linkage to the prospective approach is described and the comments of stakeholders in the context are added. Some preliminary outcomes are suggested. The particular focus of the paper is on the lessons learned from doing SSA/SPSA within a classic blueprint project framework. It is not assumed that SSA/SPSA is 'finished' or 'definitive'. Rather, we suggest that it is a developing and changing approach that practitioners can adapt and change to meet the specific needs of the circumstances that confront them. Copyright (C) 2004 John Wiley & Sons, Ltd and ERP Environment.

Relevance:

10.00%

Publisher:

Abstract:

Problem structuring methods or PSMs are widely applied across a range of variable but generally small-scale organizational contexts. However, it has been argued that they are seen and experienced less often in areas of wide-ranging and highly complex human activity, specifically those relating to sustainability, environment, democracy and conflict (or SEDC). In an attempt to plan, track and influence human activity in SEDC contexts, the authors in this paper make the theoretical case for a PSM, derived from various existing approaches. They show how it could make a contribution in a specific practical context: within sustainable coastal development projects around the Mediterranean which have utilized systemic and prospective sustainability analysis or, as it is now known, Imagine. The latter is itself a PSM, but one which is 'bounded' within the limits of the project to help deliver the required 'deliverables' set out in the project blueprint. The authors argue that sustainable development projects would benefit from a deconstruction of process by those engaged in the project and suggest one approach that could be taken: a breakout from a project-bounded PSM to an analysis that embraces the project itself. The paper begins with an introduction to the sustainable development context and literature and then goes on to illustrate the issues by grounding the debate within a set of projects facilitated by Blue Plan for Mediterranean coastal zones. The paper goes on to show how the analytical framework could be applied and what insights might be generated.

Relevance:

10.00%

Publisher:

Abstract:

This article critically examines the challenges that come with implementing the Extractive Industries Transparency Initiative (EITI), a policy mechanism marketed by donors and Western governments as a key to facilitating economic improvement in resource-rich developing countries, in sub-Saharan Africa. The forces behind the EITI contend that impoverished institutions, the embezzlement of petroleum and/or mineral revenues, and a lack of transparency are the chief reasons why resource-rich sub-Saharan Africa is underperforming economically, and that implementation of the EITI, with its foundation of good governance, will help address these problems. The position here, however, is that the task is by no means straightforward: the EITI is not necessarily a blueprint for facilitating good governance in the region's resource-rich countries. It is concluded that the EITI is a policy mechanism that could prove to be effective with significant institutional change in host African countries but, on its own, it is incapable of reducing corruption and mobilizing citizens to hold government officials accountable for hoarding profits from extractive industry operations.

Relevance:

10.00%

Publisher:

Abstract:

It has become evident that the mystery of life will not be deciphered just by decoding its blueprint, the genetic code. In the life and biomedical sciences, research efforts are now shifting from pure gene analysis to the analysis of all biomolecules involved in the machinery of life. One area of these postgenomic research fields is proteomics. Although proteomics, which basically encompasses the analysis of proteins, is not a new concept, it is far from being a research field that can rely on routine and large-scale analyses. At the time the term proteomics was coined, a gold-rush mentality was created, promising vast and quick riches (i.e., solutions to the immensely complex questions of life and disease). Predictably, the reality has been quite different. The complexity of proteomes and the wide variations in the abundances and chemical properties of their constituents have rendered the use of systematic analytical approaches only partially successful, and biologically meaningful results have been slow to arrive. However, to learn more about how cells and, hence, life works, it is essential to understand the proteins and their complex interactions in their native environment. This is why proteomics will be an important part of the biomedical sciences for the foreseeable future. Therefore, any advances in providing the tools that make protein analysis a more routine and large-scale business, ideally using automated and rapid analytical procedures, are highly sought after. This review will provide some basics, thoughts and ideas on the exploitation of matrix-assisted laser desorption/ionization in biological mass spectrometry - one of the most commonly used analytical tools in proteomics - for high-throughput analyses.

Relevance:

10.00%

Publisher:

Abstract:

It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure-function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This 'Cartesian' description constitutes a completely accurate mapping of dendritic morphology but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g. maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e. it is not complete enough to provide a precise 'blueprint' of the original data. We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of 'fundamental', measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense.
If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain in a personal computer. We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the 'computational neuroanatomy' strategy for neuroscience databases.
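The 'Cartesian' description in the abstract above (a dendritic tree as a list of cylinders, each with a diameter, 3-D coordinates and a link to its parent) and a 'classical' statistical summary (e.g. branching order) can both be sketched in a few lines. This is a minimal illustration, not the authors' actual file format or tools; the `Cylinder` class and `branch_orders` function are hypothetical names chosen for the example, loosely modelled on SWC-style tracing files.

```python
# A dendritic tree as a flat list of cylinders, each linked to its parent.
from dataclasses import dataclass

@dataclass
class Cylinder:
    ident: int                      # unique id of this compartment
    x: float; y: float; z: float    # spatial coordinates
    diameter: float
    parent: int                     # parent cylinder's id; -1 for the root

def branch_orders(tree):
    """Classical-style summary: each cylinder's branching order,
    i.e. the number of bifurcations on the path back to the root."""
    by_id = {c.ident: c for c in tree}
    children = {}
    for c in tree:
        children.setdefault(c.parent, []).append(c.ident)
    orders = {}
    for c in tree:
        order, p = 0, c.parent
        while p != -1:
            if len(children.get(p, [])) > 1:   # p is a bifurcation point
                order += 1
            p = by_id[p].parent
        orders[c.ident] = order
    return orders

# A toy tree: soma -> one trunk that bifurcates into two tips.
toy = [
    Cylinder(1, 0.0, 0.0, 0.0, 2.0, -1),   # soma (root)
    Cylinder(2, 0.0, 0.0, 5.0, 1.0, 1),    # trunk
    Cylinder(3, -2.0, 0.0, 8.0, 0.5, 2),   # left tip
    Cylinder(4, 2.0, 0.0, 8.0, 0.5, 2),    # right tip
]
print(branch_orders(toy))  # -> {1: 0, 2: 0, 3: 1, 4: 1}
```

The flat list is the complete, accurate 'Cartesian' mapping; the dictionary of branching orders is the intuitive but lossy statistical view, which is exactly the gap the abstract's algorithmic description aims to bridge.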

Relevance:

10.00%

Publisher:

Abstract:

Fiona Ross and Tim Holloway (co-designers) were commissioned by the renowned newspaper and publishing house Anandabazar Patrika (ABP) to design a new low-contrast typeface in a contemporary style for print and screen use in its publications. Ross and Holloway had designed ABP's Bengali house typeface (Linotype Bengali, the first digital Bengali font), which has been in daily use in its newspaper since 1982. The design team was augmented by Neelakash Kshetrimayum; OpenType production was undertaken by John Hudson. The new Bengali typeface, Sarkar, is the first fully functional OpenType design for the script. It demonstrates innovative features that resolve problems which hitherto hindered the successful execution of low-contrast Bengali text fonts: this connecting script of over 450 characters has deep verticals, spiralling strokes, wide characters, and intersecting ascenders. The new design overcomes the need for wide interlinear spacing and sets more words to the line than has previously been possible. This project therefore combines the use of aesthetic, technical and linguistic skills and is highly visible, in print and online, in the newspapers of the largest newspaper group and publishing house in West Bengal. The design and development of Sarkar has positive implications for other non-Latin script designs, just as the Linotype Bengali typeface formed the blueprint for new non-Latin designs three decades ago. Sarkar was released on 31 August 2012 with the launch of Anandabazar Patrika's new newspaper Ebela.

Relevance:

10.00%

Publisher:

Abstract:

In this EUDO CITIZENSHIP Forum Debate, several authors consider the interrelations between eligibility criteria for participation in an independence referendum (that may result in the creation of a new independent state) and the determination of putative citizenship ab initio (on day one) of such a state. The kick-off contribution argues for resemblance between an independence referendum franchise and the initial determination of the citizenry, critically appraising the incongruence between the franchise for the 18 September 2014 Scottish independence referendum and the blueprint for Scottish citizenship ab initio put forward by the Scottish Government in its 'Scotland's Future' White Paper. Contributors to this debate come from divergent disciplines (law, political science, sociology, philosophy). They reflect on and contest the above claims, both generally and in relation to regional settings including (in addition to Scotland) Catalonia/Spain, Flanders/Belgium, Quebec/Canada, Post-Yugoslavia and Puerto Rico/USA.