718 results for Intuitive
Abstract:
Activation of the hypoxia-inducible factor (HIF) pathway is a critical step in the transcriptional response to hypoxia. Although many of the key proteins involved have been characterised, the dynamics of their interactions in generating this response remain unclear. In the present study, we have generated a comprehensive mathematical model of the HIF-1α pathway based on core validated components and dynamic experimental data, and confirm the previously described connections within the predicted network topology. Our model confirms previous work demonstrating that the steps leading to optimal HIF-1α transcriptional activity require sequential inhibition of both prolyl- and asparaginyl-hydroxylases. We predict from our model (and confirm experimentally) that there is residual activity of the asparaginyl-hydroxylase FIH (factor inhibiting HIF) at low oxygen tension. Furthermore, silencing FIH under conditions where prolyl-hydroxylases are inhibited results in increased HIF-1α transcriptional activity, but paradoxically decreases HIF-1α stability. Using a core module of the HIF network and mathematical proof supported by experimental data, we propose that asparaginyl hydroxylation confers a degree of resistance to proteasomal degradation upon HIF-1α. Thus, through in vitro experimental data and in silico predictions, we provide a comprehensive model of the dynamic regulation of HIF-1α transcriptional activity by hydroxylases and use its predictive and adaptive properties to explain counter-intuitive biological observations.
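The counter-intuitive observation above (silencing FIH increases HIF-1α activity but decreases its stability) can be mimicked with a toy model. The sketch below is emphatically not the authors' published model: the Michaelis-Menten oxygen dependences, the single "protected" pool and all rate constants are illustrative assumptions, chosen only to reproduce the qualitative behaviour described in the abstract.

```python
# Toy ODE sketch: O2-dependent prolyl hydroxylation (PHD) marks HIF-1a for
# proteasomal degradation; asparaginyl hydroxylation (FIH) blocks
# transactivation but, per the paper's hypothesis, also confers partial
# resistance to degradation (factor r < 1). All forms/constants are assumptions.
from scipy.integrate import solve_ivp

def hif_toy(t, y, o2, k_syn=1.0, k_phd=4.0, k_fih=2.0, km=0.3, r=0.3):
    hif, hif_ohasn = y                        # active HIF-1a, Asn-hydroxylated pool
    phd = k_phd * o2 / (km + o2)              # O2-dependent prolyl-hydroxylase flux
    fih = k_fih * o2 / (km + o2)              # FIH keeps residual activity at low O2
    return [k_syn - (phd + fih) * hif,        # synthesis, degradation, Asn-hydroxylation
            fih * hif - r * phd * hif_ohasn]  # protected pool degrades more slowly

for o2, fih_on in [(1.0, True), (0.05, True), (0.05, False)]:
    args = (o2,) if fih_on else (o2, 1.0, 4.0, 0.0)  # silence FIH by zeroing k_fih
    sol = solve_ivp(hif_toy, (0, 200), [0.0, 0.0], args=args)
    active, protected = sol.y[0, -1], sol.y[1, -1]
    print(f"O2={o2}, FIH={'on' if fih_on else 'off'}: "
          f"active={active:.2f}, total={active + protected:.2f}")
```

At low O2 with FIH silenced, the toy gives more active HIF-1α but a smaller total pool, mirroring the activity/stability trade-off the abstract reports.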
Abstract:
Book review: Marketinggeschichte: die Genese einer modernen Sozialtechnik [Marketing history: The genesis of a modern social technique], edited by Hartmut Berghoff, Frankfurt/Main, Campus Verlag, 2007, 409 pp., illus., €30.00 (paperback), ISBN 978-3-593-38323-1. This edited volume is the result of a workshop at Göttingen University in 2006 and combines a number of different approaches to research into the history of marketing in Germany's economy and society. The majority of contributions focus loosely on the occurrence of a ‘marketing revolution’ in the 1970s, which ties in with interpretations of the Americanisation of German business. This revolution replaced the indigenous German idea of Absatzwirtschaft (the economics of sales) with the American-influenced idea of Marketing, which was less functionally oriented and more strategic, and which aimed to connect processes within the firm in order to allow a greater focus on the consumer. The entire volume is framed by Hartmut Berghoff's substantial and informative introduction, which introduces a number of actors and trends beyond the content of the volume. Throughout the various contributions, authors offer explanations of the timing and nature of marketing revolutions. Alexander Engel identifies an earlier revolution in the marketing of dyes, which underwent major change with the emergence of chemical dyes. While natural dyestuffs had been commodities, with producers removed from consumers via a global network of traders, chemical dyes were products and were branded at an early stage. This was a fundamental change in the nature of production and sales. As Roman Rossfeld shows in his contribution on the Swiss chocolate industry (which focuses almost exclusively on Suchard), even companies that produced non-essential consumer goods which had always required some measure of labelling grappled for years with the need to develop fewer, higher-impact brands as well as an efficient sales operation. A good example of the classical ‘marketing revolution’ of the 1970s is the German automobile industry. Ingo Köhler convincingly argues that the crisis facing German car manufacturers – the change from a seller's to a buyer's market, the appreciation of the German mark, which undermined exports, the oil crises coupled with higher inflation and greater consumer frugality, and the emergence of new competitors – led companies to refocus from production to the demands of the consumer. While he highlights the role of Ford in responding most rapidly to these problems, he does not address whether the multinational was transferring American knowledge to the German market. Similarly, Paul Erker illustrates that the marketing revolution in transport and logistics happened much later, because the market remained highly regulated until the 1980s. Both Paul Erker and Uwe Spiekermann, in their contributions, present comparisons of two different companies or sectors (the tyre manufacturer Continental and the logistics company Dachser, and agriculture and trade, respectively). In both cases, however, it remains unclear why these examples were chosen for comparison, as they seem to have little in common and are not always used effectively to demonstrate differences. The weakest section of the book is that on the development of marketing as an academic discipline.
The attempt by Ursula Hansen and Matthias Bode to sketch the phases in the evolution of marketing as an academic discipline opens with an undergraduate-level explanation of the methodology of historical periodisation that seems extraneous. Considerably stronger is the section on the wider societal impact of marketing, in which Anja Kruke shows how the new techniques of opinion research were accepted by politics and business – surprisingly, more readily by politicians than by their commercial counterparts. In terms of contemporary personalities, Hans Domizlaff emerges as one of the most fascinating figures of German marketing history, to whom several contributors refer and whose career at the German cigarette manufacturer Reemtsma is critically analysed by Tino Jacobs. Domizlaff was Germany's own ‘marketing guru’, whose successful campaigns led to the wide-ranging reception of his ideas about the nature of good branding and marketing. These are variously described as intuitive, elitist, and sachlich, a German concept denoting a sober, fact-based, ‘no frills’ approach. Domizlaff did not believe in market research; rather, he saw the genius of the individual advertiser as key to intuitively ascertaining the people's moods, wishes, and desires. This seems to have made him peculiarly suited to the tastes of the German middle class, according to Thomas Mergel's contribution on the nature of political marketing in the republic. Especially in politics, any form of hard selling was severely frowned upon and considered to demean citizens by treating them as incapable of making an informed choice, a mentality that Mergel dates back to the traditions of nineteenth-century liberalism. Part of this disdain for ‘selling politics like toothpaste’ was also founded on the highly effective use of branding by the National Socialists, who identified their party through an increasingly standardised image of Adolf Hitler and the swastika. Alexander Schug builds on previous research that criticised the simplistic notion of Hitler's charisma as the sole explanation of the party's popular success, and distances his approach from those who see that success in terms of propaganda and demagogy. He argues that the NSDAP used the tools of advertising and branding precisely because it had to introduce a new ideology into a political marketplace dominated by more established parties. In this it was undoubtedly successful – more so than intended: as bakers sold swastika cookies and butchers formed Führer heads out of lard, the NSDAP sought to regain control over the now effectively iconic images that constituted its brand, which was in danger of being trivialised and devalued. Key to understanding the history of marketing in Germany is, on the one hand, the exchange of ideas with the United States and, on the other, the impact of National Socialist policies and the question of whether they were a force of modernisation or retardation. The general argument in the volume appears to favour the latter explanation. In the 1930s some of the leading marketing experts emigrated to the USA, leaving German academia and business isolated. The aftermath of the Second World War left a country that needed to increase production to satisfy consumer demand, with little interest in advanced sales techniques. And although the Nazis were progressive in applying new marketing methods to their political campaigns, this association retarded the adoption of sales techniques in politics for a long time.
Germany thus saw the development of idiosyncratic approaches by people like Domizlaff in the 1930s and 1940s, when it lost some of its leading thinkers, and only engaged with American marketing conceptions in the 1960s and 1970s, when consumers eventually became more important than producers.
Abstract:
Police-suspect interviews in England & Wales are a multi-audience, multi-purpose, transcontextual mode of discourse. They are conducted as part of the initial investigation into a crime, but are subsequently recontextualised through the judicial process, ultimately being presented in court as evidence against the interviewee. The communicative challenges posed by multiple future audiences are investigated by applying Bell’s (1984) audience design model to the police interview, and the resulting "poor fit" demonstrates why this context is discursively counter-intuitive to participants. Further, data analysis indicates that interviewer and interviewee, although ostensibly addressing each other, may orientate to different audiences, with potentially serious consequences. As well as providing new insight into police-suspect interview interaction, this article seeks to extend understanding of the influence of audience on interaction at the discourse level, and to contribute to the development of theoretical models for contexts with multiple or asynchronous audiences.
Abstract:
Conventional DEA models assume deterministic, precise and non-negative data for input and output observations. However, real applications may be characterized by observations that are given in the form of intervals and include negative numbers. For instance, the consumption of electricity in decentralized energy resources may be either negative or positive, depending on the heat consumption. Likewise, the heat losses in distribution networks may lie within a certain range, depending on, for example, external temperature and real-time outtake. Complementing earlier work that addressed the two problems (interval data and negative data) separately, we propose a comprehensive evaluation process for measuring the relative efficiencies of a set of decision-making units (DMUs) in DEA. In our general formulation, the intervals may have upper or lower bounds with different signs. The proposed method determines upper and lower bounds for the technical efficiency through the limits of the intervals after decomposition. Based on the interval scores, DMUs are then classified into three classes, namely the strictly efficient, the weakly efficient and the inefficient. An intuitive ranking approach is presented for the respective classes. The approach is demonstrated through an application to the evaluation of bank branches. © 2013.
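The classification step described above can be sketched compactly. The snippet below assumes the interval efficiency bounds have already been obtained by solving the DEA programs at the interval limits, and it uses the common interval-DEA convention that a score of 1 means efficient; the function name, thresholds and example data are assumptions for illustration, not the paper's exact formulation.

```python
# Hedged sketch: classify DMUs from given interval efficiency scores.
from typing import Dict, Tuple

def classify_dmus(scores: Dict[str, Tuple[float, float]]) -> Dict[str, str]:
    """scores maps DMU name -> (lower, upper) technical-efficiency bounds."""
    classes = {}
    for dmu, (lo, up) in scores.items():
        if lo >= 1.0:                  # efficient even in the worst case
            classes[dmu] = "strictly efficient"
        elif up >= 1.0:                # efficient only in the best case
            classes[dmu] = "weakly efficient"
        else:                          # inefficient under every realisation
            classes[dmu] = "inefficient"
    return classes

# Example: three hypothetical bank branches with interval scores.
print(classify_dmus({"B1": (1.0, 1.0), "B2": (0.8, 1.0), "B3": (0.6, 0.9)}))
```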
Abstract:
Our paper presents the work of the Cuneiform Digital Forensic Project (CDFP), an interdisciplinary project at The University of Birmingham concerned with the development of a multimedia database to support scholarly research into cuneiform: wedge-shaped writing imprinted onto clay tablets, and indeed the earliest real form of writing. We describe the evolutionary design process and the dynamic research and development cycles associated with the database. Unlike traditional publications, the electronic publication of resources offers the possibility of almost continuous revision, with the integration and support of new media and interfaces. However, if on-line resources are to win the favor and confidence of their respective communities, there must be a clear distinction between published, maintainable resources and developmental content. Published material should ideally be supported via standard web-browser interfaces with fully integrated tools, so that users receive a reliable, homogeneous and intuitive flow of information and media relevant to their needs. We discuss the inherent dynamics of the design and publication of our on-line resource, starting with the basic design and maintenance aspects of the electronic database, which includes photographic instances of cuneiform signs, and show how the continuous review process identifies areas for further research and development – for example, the “sign processor” graphical search tool and three-dimensional content – the results of which then feed back into the maintained resource.
Abstract:
Artifact selection decisions typically involve choosing one from a number of possible/candidate options (decision alternatives). In order to support such decisions, it is important to identify and recognize the relevant key issues of problem solving and decision making (Albers, 1996; Harris, 1998a, 1998b; Jacobs & Holten, 1995; Loch & Conger, 1996; Rumble, 1991; Sauter, 1999; Simon, 1986). Sauter distinguishes four problem-solving/decision-making styles: (1) left-brain style, (2) right-brain style, (3) accommodating, and (4) integrated (Sauter, 1999). The left-brain style employs analytical and quantitative techniques and relies on rational and logical reasoning. In an effort to achieve predictability and minimize uncertainty, problems are explicitly defined, solution methods are determined, orderly information searches are conducted, and analysis is increasingly refined. Left-brain decision making works best when it is possible to predict, control, measure, and quantify all relevant variables and when information is complete. In direct contrast, right-brain decision making is based on intuitive techniques; it places more emphasis on feelings than on facts. Accommodating decision makers use their non-dominant style when they realize that it will work best in a given situation. Lastly, integrated-style decision makers are able to combine the left- and right-brain styles: they use analytical processes to filter information and intuition to contend with uncertainty and complexity.
Abstract:
Purpose: To determine the effect of coloured light filter overlays on reading rates for people with age-related macular degeneration (AMD). Method: Using a prospective clinical trial design, we examined the null hypothesis that coloured light filter overlays do not improve reading rates in AMD when compared to a clear filter. Reading rates for 12 subjects with non-exudative AMD associated with a relative scotoma and central fixation (mean age 81 years, SD 5.07 years) were determined using the Rate of Reading Test® (printed, nonsense, lower-case, sans serif, stationary text) with 10 different coloured light filter overlays (Intuitive Overlays®; figures in brackets are percentage transmission values): rose (78%), pink (78%), purple (67%), aqua (81%), blue (74%), lime-green (86%), mint-green (85%), yellow (93%), orange (83%) and grey (71%). A clear overlay (Roscolene #00; 360 cd/m²) with 100% transmittance was used as a control. Results: ANOVA indicated that there was no statistically significant difference in reading rates with the coloured light filter overlays compared to the clear filter. Furthermore, chi-squared analysis indicated that the rose, purple and blue filters had a significantly poorer overall ranking in terms of reading rates compared to the other coloured and clear light filters. Conclusion: Coloured light filter overlays are unlikely to provide a clinically significant improvement in reading rates for people with non-exudative AMD associated with a relative scotoma and central fixation. Copyright © Acta Ophthalmol Scand 2004.
Abstract:
Image collections are ever growing, and hence visual information is becoming more and more important. Moreover, the classical paradigm of taking pictures has changed, first with the spread of digital cameras and, more recently, with mobile devices equipped with integrated cameras. Clearly, these image repositories need to be managed, and tools for effectively and efficiently searching image databases are highly sought after, especially on mobile devices where more and more images are being stored. In this paper, we present an image browsing system for interactive exploration of image collections on mobile devices. Images are arranged so that visually similar ones are grouped together, while large image repositories become accessible through a hierarchical, browsable tree structure laid out on a hexagonal lattice. The developed system provides an intuitive and fast interface for navigating through image databases using a variety of touch gestures. © 2012 Springer-Verlag.
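One ingredient of such a layout is generating hexagonal-lattice positions spiralling out from a focal image, so that the most similar images occupy the innermost ring. The sketch below is only a minimal illustration of that idea, assuming axial hex coordinates and a precomputed similarity ranking; the function name and image ids are hypothetical, and the paper's actual layout and hierarchy construction are more involved.

```python
# Place a similarity-ranked list of images on a hexagonal lattice, spiralling
# outwards from the focal image at the origin (axial coordinates).
def hex_spiral(n):
    """Yield n axial (q, r) lattice coordinates spiralling out from the centre."""
    dirs = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]
    yield (0, 0)
    count, ring = 1, 1
    while count < n:
        q, r = -ring, ring                  # starting corner of this ring
        for dq, dr in dirs:
            for _ in range(ring):
                if count >= n:
                    return
                yield (q, r)
                count += 1
                q, r = q + dq, r + dr
        ring += 1

# Hypothetical ids, pre-sorted by visual similarity to the focal image.
ranked = ["focus", "sunset_1", "sunset_2", "beach_1", "beach_2", "city_1", "city_2"]
layout = {coord: img for coord, img in zip(hex_spiral(len(ranked)), ranked)}
print(layout)  # the six most similar images occupy the innermost ring
```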
Abstract:
In this report we explain earlier papers [1, 2], which are about the definition of Artificial Intelligence (AI) and about perfect AI. The definition of AI is intuitive in [1] and formal in [2]. The perfect AI is a program that satisfies the definition of AI but is absolutely useless because of the combinatorial explosion. Most people find these papers hard to follow because they have never seen an AI, and the notion of AI is therefore too abstract for them. In this report we draw a parallel between the definition of a chess-playing program and the definition of AI. Of course, a definition of a chess-playing program is unnecessary, because people already know what one is. Nevertheless, we give this definition because its construction closely follows the construction of the definition of AI. The results are also almost the same, with the one difference that we can optimise the perfect chess-playing program to obtain a real chess-playing program, whereas for the moment we cannot optimise the perfect AI to obtain a real AI. In this report we do not speak about AI; the only matter we observe is chess-playing programs. If you understand the construction and the results for chess-playing programs, then you can read the papers [1, 2] and see the similar results for AI.
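The "perfect but useless" point has a compact illustration. The sketch below is not from the papers [1, 2]; it is a hypothetical stand-in that runs exhaustive minimax, the canonical perfect-but-infeasible player, on a tiny one-heap Nim game (chosen only so the sketch actually executes), followed by the branching-factor arithmetic that makes the same strategy hopeless for chess.

```python
# Exhaustive minimax is a perfect player for any finite two-player zero-sum
# game, but its cost grows roughly as branching_factor ** depth, so a
# "perfect chess program" built this way is useless in practice.
def minimax(stones, maximizing=True):
    """Exact value (maximizer's view) of one-heap Nim: take 1-3 stones,
    taking the last stone wins."""
    if stones == 0:
        return -1 if maximizing else 1      # the previous player took the last stone
    moves = range(1, min(3, stones) + 1)
    values = (minimax(stones - k, not maximizing) for k in moves)
    return max(values) if maximizing else min(values)

print(minimax(12))                          # -1: 12 stones is a loss for the mover
print(f"chess tree ~ {35 ** 80:.1e} positions")  # why exhaustive search is hopeless
```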
An improved conflicting evidence combination approach based on a new supporting probability distance
Abstract:
To avoid the counter-intuitive results of the classical Dempster combination rule when dealing with highly conflicting information, many improved combination methods have been developed by modifying the basic probability assignments (BPAs) of bodies of evidence (BOEs) using some measure of the degree of conflict or uncertain information, such as Jousselme's distance, the pignistic probability distance and the ambiguity measure. However, if BOEs contain non-singleton elements and the differences among their BPAs are larger than 0.5, the current conflict measures have limitations in describing the interrelationship among the conflicting BOEs and may even lead to wrong combination results. To solve this problem, a new distance function, called the supporting probability distance, is proposed to characterize the differences among BOEs. The new distance expresses how much each focal element is supported by the other focal elements in the BOEs. A new combination rule based on the supporting probability distance is also proposed for combining conflicting evidence: the credibility and the discounting factor of each BOE are generated from the supporting probability distance, and the weighted BOEs are combined directly using Dempster's rule. Analytical results on numerical examples show that the new distance describes the interrelationships among BOEs better, especially for highly conflicting BOEs containing non-singleton elements, and that the proposed combination method has better applicability and effectiveness than existing methods.
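The counter-intuitive behaviour that motivates the paper is easy to reproduce. Below is a minimal sketch of the classical Dempster rule together with Zadeh's well-known conflicting-evidence example; the paper's own contribution (the supporting probability distance and the credibility weighting used to discount the BPAs before fusion) is deliberately not reproduced here.

```python
# Classical Dempster combination of two BPAs over a frame of discernment.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two BPAs, given as {frozenset(hypotheses): mass}, with Dempster's rule."""
    fused, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb             # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {s: m / (1.0 - conflict) for s, m in fused.items()}

# Zadeh's classic example: both sources nearly rule out C, yet the classical
# rule concentrates all fused mass on C -- the counter-intuitive result.
A, B, C = frozenset({"A"}), frozenset({"B"}), frozenset({"C"})
print(dempster_combine({A: 0.99, C: 0.01}, {B: 0.99, C: 0.01}))  # {C: 1.0}
```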
Abstract:
When we see a stranger's face we quickly form impressions of his or her personality, and expectations of how the stranger might behave. Might these intuitive character judgements bias source monitoring? Participants read headlines "reported" by a trustworthy- and an untrustworthy-looking reporter. Subsequently, participants recalled which reporter provided each headline. Source memory for likely-sounding headlines was most accurate when a trustworthy-looking reporter had provided the headlines. Conversely, source memory for unlikely-sounding headlines was most accurate when an untrustworthy-looking reporter had provided the headlines. This bias appeared to be driven by the use of decision criteria during retrieval rather than differences in memory encoding. Nevertheless, the bias was apparently unrelated to variations in subjective confidence. These results show for the first time that intuitive, stereotyped judgements of others' appearance can bias memory attributions analogously to the biases that occur when people receive explicit information to distinguish sources. We suggest possible real-life consequences of these stereotype-driven source-monitoring biases. © 2010 Psychology Press.
Abstract:
Metrics estimate the quality of different aspects of software. In particular, cohesion indicates how well the parts of a system hold together. A metric to evaluate class cohesion is important in object-oriented programming because it gives an indication of good class design. There are several proposed metrics for class cohesion, but they suffer from various problems (for instance, low discrimination). In this paper a new metric to evaluate class cohesion, called SCOM, is proposed; it has several relevant features. It has an intuitive and analytical formulation, which is necessary for applying it to large software systems. It is normalized to produce values in the range [0..1], thus yielding meaningful values. It is also more sensitive than the metrics previously reported in the literature. The attributes and methods used to evaluate SCOM are unambiguously stated. SCOM has an analytical threshold, which is a very useful but rare feature in software metrics. We assess the metric with several sample cases, showing that it gives more sensitive values than other well-known cohesion metrics.
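As a rough illustration of the kind of formulation described (intuitive, analytical, normalised to [0..1]), here is a hedged sketch of a SCOM-style pairwise computation: each pair of methods is scored by the overlap of the attribute sets they use, weighted by the share of the class's attributes the pair touches. The function name, weights and example are assumptions for illustration; the exact published formula should be taken from the paper.

```python
# SCOM-style class cohesion sketch: average weighted attribute-sharing over
# all method pairs, yielding a value in [0..1].
from itertools import combinations

def scom(method_attrs, all_attrs):
    """method_attrs: list of sets of attributes each method uses."""
    m, a = len(method_attrs), len(all_attrs)
    if m < 2 or a == 0:
        return 0.0
    total = 0.0
    for mi, mj in combinations(method_attrs, 2):
        if mi and mj:
            connection = len(mi & mj) / min(len(mi), len(mj))
            weight = len(mi | mj) / a      # share of class state the pair covers
            total += connection * weight
    return total / (m * (m - 1) / 2)       # normalise by the number of pairs

attrs = {"x", "y", "z"}
print(scom([{"x", "y"}, {"y", "z"}, {"x"}], attrs))  # mid-range cohesion (~0.39)
```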
Abstract:
This article reviews the problems of mutual adaptation between humans and the computer environment. Features of the image-intuitive and physical-mathematical modes of perception and thinking are investigated, and the problems of choosing the means and methods of differentiated education in the computerised society are considered.
Abstract:
This paper examines the relationship between medical and hospital accounting discourses during the two decades after the 1946 National Health Service (NHS) Act for England and Wales. It argues that the departmental costing system introduced into the NHS in 1957 was concerned with the administrative aspects of hospital costliness, as contemporary hospital accountants suggested that the perceived incomparability, immeasurability and uncontrollability of medical practice precluded the application of cost accounting to the clinical functions of hospitals. The paper links these suggestions to medical discourses which portrayed the practice of medicine as an intuitive and experience-based art, and argues that post-war conceptions of clinical medicine represented this domain in a manner that was susceptible neither to the calculations of cost accountants nor to calculating and normalising intervention more generally. The paper concludes by suggesting that a closer engagement with medical discourses may enhance our understanding of historical as well as present-day attempts to make medicine calculable.