739 results for Hidden homelessness

Relevance:

10.00%

Publisher:

Abstract:

A series of imitation games, involving both 3-participant tests (simultaneous comparison of two hidden entities) and 2-participant tests (direct interrogation of a hidden entity), was conducted at Bletchley Park on the 100th anniversary of Alan Turing's birth: 23 June 2012. From the ongoing analysis of over 150 games involving judges (expert and non-expert, male and female, adult and child), machines and hidden humans (foils for the machines), we present six particular conversations between human judges and a hidden entity that produced unexpected results. From this sample we focus on a feature of Turing's machine intelligence test that the mathematician and code breaker did not consider in his examination of machine thinking: the subjective nature of attributing intelligence to another mind.

Climate change is putting Colombian agriculture under significant stress and, if no adaptation is made, the sector will be severely affected over the coming decades. Ramirez-Villegas et al. (2012) set out a government-led, top-down, techno-scientific proposal for a way forward by which Colombian agriculture could adapt to climate change. However, this proposal largely overlooks the root causes of the vulnerability of Colombian agriculture, and of smallholders in particular. I discuss some of the hidden assumptions underpinning this proposal, and some of the arguments employed by Ramirez-Villegas et al., based on the existing literature on Colombian agriculture and the wider scientific debate on adaptation to climate change. While technical measures may play an important role in the adaptation of Colombian agriculture to climate change, I question whether the actions listed in the proposal, alone and specifically for smallholders, truly represent priority issues. I suggest that by (i) looking at vulnerability before adaptation, (ii) contextualising climate change as one of multiple exposures, and (iii) truly putting smallholders at the centre of adaptation, i.e. learning about and with them, different and perhaps more urgent priorities for action can be identified. Ultimately, I argue that what is at stake is not only a list of adaptation measures but, more importantly, the scientific approach from which priorities for action are identified. In this respect, I propose that transformative adaptation, rather than a technical fix, represents a better approach for Colombian agriculture, and smallholders in particular, in the face of climate change.

Many important drugs in the Chinese materia medica (CMM) are known to be toxic, and it has long been recognized in classical Chinese medical theory that toxicity can arise directly from the components of a single CMM or may be induced by an interaction between combined CMM. Traditional Chinese Medicine presents a unique set of pharmaceutical theories that include particular methods for processing, combining and decocting, and these techniques contribute to reducing toxicity as well as enhancing efficacy. The current classification of toxic CMM drugs, the traditional methods for processing toxic CMM and the prohibition of certain combinations are based on traditional experience and on ancient texts and monographs, but accumulating evidence increasingly supports their use to eliminate or reduce toxicity. Modern methods are now being used to evaluate the safety of CMM; however, a new system for describing the toxicity of Chinese herbal medicines may need to be established to take into account those herbs whose toxicity is delayed or otherwise hidden, and which have not been incorporated into the traditional classification. This review explains the existing classification and justifies it where appropriate, using experimental results often originally published in Chinese and previously not available outside China.

In this paper we consider transcripts that originated from a practical series of Turing's Imitation Game held on 23 June 2012 at Bletchley Park, England. In some cases the tests involved a 3-participant simultaneous comparison of two hidden entities, whereas others were the result of a direct 2-participant interaction. Each of the transcripts considered here resulted in a human interrogator being fooled by a machine into concluding that they had been conversing with a human. Particular features of the conversations are highlighted, successful ploys on the part of each machine are discussed, and likely reasons for the interrogator being fooled are considered. Subsequent feedback from the interrogators involved is also included.

The focus here is on the influence of the endgame KRPKBP on endgames featuring duels between rook and bishop. We take advantage of the range of endgame tablebases and tools now available to ratify and extend previous analyses of five examples, including the conclusion of the justly famous 1979 Rio Interzonal game, Timman-Velimirović. The tablebases help us understand the hidden depths of the chess endgame: the path to the draw here is narrower than expected, chess engines without tablebases still do not find all the wins, and there are further surprises in store when more pawns are added.

Research into the dark side of customer management and marketing is steadily growing. The marketing landscape today is dominated by suspicion and distrust as a result of practices that include hidden fees, deception and information mishandling. In such a pessimistic economy, marketers must reconceptualise the notion of fairness in marketing and customer management, so that the progress of sophisticated customisation schemes and advances in marketing can flourish while avoiding further control and imposed regulation. In this article, emerging research is drawn upon to suggest that existing quality measures of marketing activities, including service, relationships and experiences, may not capture what matters in a socially and ethically oriented marketing landscape, and on that basis do not measure the fairness that is truly important in such an economy. The paper puts forward the concept of Fairness Quality (FAIRQUAL), which includes and extends existing thinking on relationship building, experience creation and other types of customer management practices that are believed to predict consumer intentions. It is proposed that a fairness quality measure will aid marketers in this challenging landscape and economy.

Purpose – This paper aims to examine current research trends in corporate governance and to propose a different, dynamic, humanistic approach based on individual purpose, values and psychology. Design/methodology/approach – The paper reviews selected literature to analyse the assumptions behind research into corporate governance and uses a multi-disciplinary body of literature to present a different theoretical approach based at the level of the individual rather than the organisation. Findings – The paper shows how the current recommendations of corporate governance research models could backfire and lead to individual actions that are destructive when implemented in practice. This claim is based on identifying the hidden assumptions behind the principal-agent model in corporate governance, such as the Hobbesian view and the Homo Economicus approach. It argues against the axiomatic view that shareholders are the owners of the company, and it questions the way in which managers are assessed, based either on the corporate share price (the shareholder view) or on a confusing set of measures that includes more stakeholders (the stakeholder view), and shows how such a yardstick can be demotivating and put the corporation in danger. The paper proposes a humanistic, psychological approach that uses the individual manager, rather than the corporation, as the unit of analysis, and illustrates how such an approach can help to build better governance. Research limitations/implications – The paper's limited scope allows only the outline of a conceptual framework, without detailed operationalisation. Practical implications – The paper illustrates the challenges of applying the proposed framework in practice.
Originality/value – The paper calls for the use of an alternative unit of analysis, the manager, and for a dynamic, humanistic approach that encompasses the entirety of a person's cognition, including emotional and spiritual values, and that is as yet rarely found in the corporate governance literature.

This article reflects on the introduction of ‘matrix management’ arrangements for an Educational Psychology Service (EPS) within a Children’s Service Directorate of a Local Authority (LA). It seeks to demonstrate critical self-awareness, considers relevant literature with a view to bringing insights to processes and outcomes, and offers recommendations regarding the use of matrix management. The report arises from an East Midlands LA initiative: ALICSE − Advanced Leadership in an Integrated Children’s Service Environment. Through a literature review and personal reflection, the authors consider the following: possible tensions within the development of matrix management arrangements; whether matrix management is a prerequisite within complex organisational systems; and whether competing professional cultures may create barriers to complementary and collegiate working. The authors briefly consider some research paradigms, notably ethnographic approaches, soft systems methodology, activity theory and appreciative inquiry. These provide an analytic framework for the project and inform the iterative process of collaborative inquiry. Whilst these models help illuminate otherwise hidden processes, none has been implemented as a full research methodology, reflecting the messy reality of local authority working within dynamic organisational structures and shrinking budgets. Nevertheless, this article offers an honest reflection on organisational change within a children’s services environment.

This paper presents some important issues concerning the misidentification of human interlocutors in text-based communication during practical Turing tests. The study presents transcripts in which human judges succumbed to the confederate effect, misidentifying hidden human foils as machines, and an attempt is made to assess the reasons for this. The practical Turing tests in question were held on 23 June 2012 at Bletchley Park, England. A selection of actual full transcripts from the tests is shown and an analysis is given in each case. From these tests, conclusions are drawn about the sorts of interrogator strategies that can lead to erroneous conclusions. Such results also serve to indicate conversational directions to avoid for machine designers who wish to create a conversational entity that performs well on the Turing test.

Interpretation of utterances affects an interrogator’s determination of human from machine during live Turing tests. Here we consider transcripts produced during a series of practical Turing tests held on 23 June 2012 at Bletchley Park, England. The focus of this paper is on the effects that lying and truth-telling by the hidden entities, whether human or machine, have on the human judges. Turing test transcripts provide a glimpse into short text communication of the type that occurs in emails: how does the reader determine truth from the content of a stranger’s textual message? Different types of lying in the conversations are explored, and the judge’s attribution of human or machine is investigated in each test.

Whilst common-sense knowledge has been well researched in terms of intelligence and (in particular) artificial intelligence, specific factual knowledge also plays a critical part in practice. When it comes to testing for intelligence, testing for factual knowledge is, in everyday life, frequently used as a front-line tool. This paper presents new results from a series of practical Turing tests held on 23 June 2012 at Bletchley Park, England. The focus of this paper is on the use of specific knowledge testing by interrogators. Of interest are the prejudiced assumptions made by interrogators as to what they believe should be widely known, and the conclusions subsequently drawn when an entity does or does not appear to know a particular fact known to the interrogator. The paper is not about the performance of machines or hidden humans, but rather about the strategies, based on assumptions, of Turing test interrogators. Full, unedited transcripts from the tests are shown for the reader as working examples. As a result, it may be possible to draw critical conclusions about human concepts of intelligence, in terms of the role played by specific factual knowledge in our understanding of intelligence, whether exhibited by a human or a machine. This is specifically intended as a position paper: firstly, it claims that practicalising Turing’s test is a useful exercise that throws light on how we humans think; secondly, it takes a potentially controversial stance, because some interrogators adopt a solipsistic questioning style, treating a hidden entity as a thinking, intelligent human only if it thinks like them and knows what they know. The paper is aimed at opening discussion of the different aspects considered.

In the UK, architectural design is regulated through a system of design control in the public interest, which aims to secure and promote ‘quality’ in the built environment. Design control is implemented primarily by locally employed planning professionals with political oversight, and by independent design review panels staffed predominantly by design professionals. Design control has a lengthy and complex history, and the concept of ‘design’ poses a range of challenges for a regulatory system of governance. A simultaneously creative and emotive discipline, architectural design is difficult to regulate objectively or consistently, which often leads to policy that is regarded as highly discretionary and flexible. This makes regulatory outcomes difficult to predict, as the approaches taken by the ‘agents of control’ vary from individual to individual. The role of the design controller is therefore central, tasked with the responsibility of interpreting design policy and guidance, appraising design quality and passing professional judgment. However, little is really known about what influences the way design controllers approach their task, which places a ‘veil’ over design control, shrouding the basis of their decisions. This research engaged directly with the attitudes and perceptions of design controllers in the UK, lifting this ‘veil’. Using in-depth interviews and Q-Methodology, the thesis explores this hidden element of control, revealing a number of key differences in how controllers approach and implement policy and guidance, conceptualise design quality, and rationalise their evaluations and judgments.
The research develops a conceptual framework for agency in design control – this consists of six variables (Regulation; Discretion; Skills; Design Quality; Aesthetics; and Evaluation) and it is suggested that this could act as a ‘heuristic’ instrument for UK controllers, prompting more reflexivity in relation to evaluating their own position, approaches, and attitudes, leading to better practice and increased transparency of control decisions.

This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. Firstly, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem: spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures; the two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for the classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required in a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity to establish complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis.
Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
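The defining property of an extreme learning machine, as used in the abstract above, is that the hidden layer is random and fixed and only the output weights are fitted, in closed form via a least-squares pseudoinverse. A minimal real-valued sketch in Python/NumPy illustrates the idea; the paper's classifier is complex-valued and kernel-based, and the toy data, function names and `n_hidden` value here are purely illustrative:

```python
import numpy as np

def elm_train(X, Y, n_hidden=30, seed=None):
    """Basic (real-valued) extreme learning machine: the hidden layer is
    random and never trained; output weights come from a least-squares fit."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                     # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy two-class problem standing in for the spectral data: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
Y = np.array([[1, 0]] * 100 + [[0, 1]] * 100)        # one-hot labels
W, b, beta = elm_train(X, Y, seed=1)
acc = (elm_predict(X, W, b, beta).argmax(1) == Y.argmax(1)).mean()
```

Because training reduces to a single pseudoinverse, fitting is typically much faster than the iterative optimisation of a support vector machine, which is the speed comparison the abstract draws.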

Background: In many experimental pipelines, clustering of multidimensional biological datasets is used to detect hidden structures in unlabelled input data. Taverna is a popular workflow management system that is used to design and execute scientific workflows and aid in silico experimentation. The availability of fast unsupervised methods for clustering and visualization in the Taverna platform is important to support data-driven scientific discovery in complex and explorative bioinformatics applications. Results: This work presents a Taverna plugin, the Biological Data Interactive Clustering Explorer (BioDICE), which performs clustering of high-dimensional biological data and provides a nonlinear, topology-preserving projection for visualizing the input data and their similarities. The core algorithm in the BioDICE plugin is the Fast Learning Self Organizing Map (FLSOM), an improved variant of the Self Organizing Map (SOM) algorithm. The plugin generates an interactive 2D map that allows the visual exploration of multidimensional data and the identification of groups of similar objects. The effectiveness of the plugin is demonstrated on a case study related to chemical compounds. Conclusions: The number and variety of available tools, together with its extensibility, have made Taverna a popular choice for the development of scientific data workflows. This work presents a novel plugin, BioDICE, which adds a data-driven knowledge discovery component to Taverna. BioDICE provides an effective and powerful clustering tool, which can be adopted for the explorative analysis of biological datasets.
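The "nonlinear, topology-preserving projection" at the heart of the plugin is the classical SOM mechanism: grid cells hold weight vectors, the best matching unit for each sample is pulled toward it, and neighbouring cells are dragged along so that nearby grid positions represent similar inputs. FLSOM's specific speed-ups are not reproduced here; the following is a plain-SOM sketch with illustrative data and parameters:

```python
import numpy as np

def train_som(data, grid=(6, 6), epochs=40, seed=None):
    """Classical SOM: each grid cell holds a weight vector pulled toward the
    samples it wins; a Gaussian neighbourhood drags nearby cells along, so
    the 2D grid becomes a topology-preserving projection of the input space."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for epoch in range(epochs):
        lr = 0.5 * (1 - epoch / epochs)                    # decaying learning rate
        sigma = max(1.0, (h / 2) * (1 - epoch / epochs))   # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(((weights - x) ** 2).sum(1))   # best matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(1)      # grid distance to BMU
            weights += lr * np.exp(-d2 / (2 * sigma**2))[:, None] * (x - weights)
    return weights, coords

def project(data, weights, coords):
    """Place each sample at the grid position of its best matching unit."""
    return np.array([coords[np.argmin(((weights - x) ** 2).sum(1))] for x in data])

# Two well-separated 4-dimensional clusters as stand-in data.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.2, 0.05, (50, 4)), rng.normal(0.8, 0.05, (50, 4))])
weights, coords = train_som(data, seed=1)
pos = project(data, weights, coords)   # the two clusters land in distinct regions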

A discrete-time random process that can generate bursty sequences of events is described. A Bernoulli process, in which the probability of an event occurring at time t is given by a fixed probability x, is modified to include a memory effect whereby the event probability is increased in proportion to the number of events that occurred within a given amount of time preceding t. For small values of x the interevent time distribution follows a power law with exponent −2−x. We consider a dynamic network in which each node forms and breaks connections according to this process. The value of x for each node depends on the fitness distribution ρ(x) from which it is drawn; we find exact solutions for the expectation of the degree distribution for a variety of possible fitness distributions, both with and without the memory effect. This work can potentially lead to methods for uncovering hidden fitness distributions from fast-changing temporal network data, such as online social communications and fMRI scans.
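The memory-modified Bernoulli process is easy to simulate. The abstract does not fix the exact form of the memory term, so the sketch below assumes a simple variant: the event probability at time t is the base rate x plus a contribution alpha per event in the preceding `window` steps (alpha and window are illustrative parameters, not from the paper). The burstiness shows up as overdispersed interevent times, with a coefficient of variation above the roughly 1 expected from a memoryless Bernoulli process:

```python
import numpy as np

def bursty_sequence(x, T, window=50, alpha=0.015, seed=None):
    """Bernoulli process with memory: event probability at time t is the base
    rate x plus alpha per event in the preceding `window` steps. Keeping
    alpha * window < 1 makes the process subcritical, so bursts die out."""
    rng = np.random.default_rng(seed)
    events = np.zeros(T, dtype=bool)
    for t in range(T):
        recent = events[max(0, t - window):t].sum()          # memory term
        events[t] = rng.random() < min(1.0, x + alpha * recent)
    return events

ev = bursty_sequence(x=0.01, T=200_000, seed=0)
gaps = np.diff(np.flatnonzero(ev))                           # interevent times
cv = gaps.std() / gaps.mean()
# cv is ~1 for a memoryless Bernoulli process; the memory effect pushes it higher,
# as events cluster into bursts separated by long quiet gaps.
```

In this variant each event raises the hazard by alpha for `window` steps, so the expected number of "offspring" events per event is alpha * window; values below 1 give finite bursts, which is what makes the interevent distribution heavy-tailed rather than geometric.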