718 results for Intuitive
Abstract:
The notion of implicature was first introduced by Paul Grice (1967, 1989), who defined it essentially as what is communicated less what is said. This definition contributed in part to the proliferation of different species of implicature among neo-Griceans. Relevance theorists have responded by proposing a shift back to the distinction between "explicit" and "implicit" meaning (corresponding to "explicature" and "implicature," respectively). However, they appear to have pared down the concept of implicature too far, overgeneralizing the concept of explicature and thereby ignoring phenomena that may be better treated as implicatures. These problems have their roots in the fact that explicit and implicit meaning intuitively overlap and thus do not provide a suitable basis for distinguishing implicature from other types of pragmatic phenomena. An alternative conceptualization of implicature is therefore proposed, based on the concept of "implying" with which Grice originally associated his notion of implicature. On this definition, an implicature is something inferred by the addressee that is not literally said by the speaker but is meant in addition to what the speaker literally says, and it is consequently defeasible like all other types of pragmatic phenomena. 1 Figure, 60 References.
Abstract:
Intraoperative cardiac imaging plays a key role during transcatheter aortic valve replacement. In recent years, new techniques and new tools for improved image quality and virtual navigation have been proposed in order to simplify and standardize stent valve positioning and implantation. However, routine use of these new techniques may require major economic investment or specific knowledge and skills, and for this reason they may not be accessible to the majority of cardiac centres involved in transcatheter valve replacement projects. Additionally, they still require injections of contrast medium to obtain computed images. We have therefore developed, and describe here, a very simple and intuitive method of positioning balloon-expandable stent valves, which represents the evolution of the 'dumbbell' technique for echocardiography-guided transcatheter valve replacement without angiography. This method, based on partial inflation of the balloon catheter during positioning, traps the crimped valve in the aortic valve orifice and, consequently, very near the ideal landing zone. It does not require specific echocardiographic knowledge; it does not require angiographies, which increase the risk of postoperative kidney failure in elderly patients; and it can also be performed in centres not equipped with a hybrid operating room.
Abstract:
We present an envelope theorem for establishing first-order conditions in decision problems involving continuous and discrete choices. Our theorem accommodates general dynamic programming problems, even with unbounded marginal utilities, and, unlike classical envelope theorems that focus only on differentiating value functions, it accommodates other endogenous functions such as default probabilities and interest rates. Our main technical ingredient is how we establish the differentiability of a function at a point: we sandwich the function between two differentiable functions from above and below. Our theory is widely applicable. In unsecured credit models, neither interest rates nor continuation values are globally differentiable; nevertheless, we establish an Euler equation involving marginal prices and values. In adjustment cost models, we show that first-order conditions apply universally, even if optimal policies are not (S,s). Finally, we incorporate indivisible choices into a classic dynamic insurance analysis.
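The sandwiching step can be made concrete as a standard lemma (a sketch consistent with the technique described, not a statement quoted from the paper):

\textit{Lemma.} Suppose $g(x) \le f(x) \le h(x)$ for all $x$ in a neighbourhood of $x_0$, with $g(x_0) = f(x_0) = h(x_0)$, and suppose $g$ and $h$ are differentiable at $x_0$ with $g'(x_0) = h'(x_0) = d$. Then $f$ is differentiable at $x_0$ and $f'(x_0) = d$, because for $x > x_0$
\[
  \frac{g(x)-g(x_0)}{x-x_0} \;\le\; \frac{f(x)-f(x_0)}{x-x_0} \;\le\; \frac{h(x)-h(x_0)}{x-x_0},
\]
the inequalities reverse for $x < x_0$, and both outer quotients converge to $d$.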
Abstract:
We argue for the importance both of developing simple sufficient conditions for the stability of general multiclass queueing networks and of assessing such conditions under a range of assumptions on the weight of the traffic flowing between service stations. To achieve the former, we review a peak-rate stability condition and extend its range of application; for the latter, we introduce a generalisation of the Lu-Kumar network on which the stability condition may be tested for a range of traffic configurations. The peak-rate condition is close to exact when the between-station traffic is light, but degrades as this traffic increases.
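For orientation only, the sketch below computes the familiar nominal-load (traffic-intensity) quantities for a hypothetical two-class network; this is the standard necessary condition rho_j < 1 at each station, not the peak-rate sufficient condition reviewed in the paper, and the routes, rates, and service times are invented.

# Illustrative only: nominal traffic intensity per station for a multiclass
# network (necessary condition rho_j < 1); NOT the paper's peak-rate condition.
# Routes, arrival rates, and mean service times are made-up example values.
routes = {
    "class1": [("A", 0.2), ("B", 0.6)],   # (station, mean service time), visited in order
    "class2": [("B", 0.1), ("A", 0.5)],
}
arrival_rate = {"class1": 0.8, "class2": 0.7}  # external arrivals per unit time

rho = {}
for cls, visits in routes.items():
    for station, mean_service in visits:
        rho[station] = rho.get(station, 0.0) + arrival_rate[cls] * mean_service

for station, load in sorted(rho.items()):
    status = "< 1, necessary condition holds" if load < 1 else ">= 1, cannot be stable"
    print(f"station {station}: nominal load rho = {load:.2f} ({status})")

The Lu-Kumar network is notable precisely because rho_j < 1 at every station does not guarantee stability, which is why sharper sufficient conditions such as the peak-rate condition are of interest.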
Abstract:
Recent research has highlighted the notion that people can make judgments and choices by means of two systems that are labeled here tacit (or intuitive) and deliberate (or analytic). Whereas most decisions typically involve both systems, this chapter examines the conditions under which each system is liable to be more effective. This aims to illuminate the age-old issue of whether and when people should trust intuition or analysis. To do this, a framework is presented to understand how the tacit and deliberate systems work in tandem. Distinctions are also made between the types of information typically used by both systems as well as the characteristics of environments that facilitate or hinder accurate learning by the tacit system. Next, several experiments that have contrasted intuitive and analytic modes on the same tasks are reviewed. Together, the theoretical framework and experimental evidence lead to specifying the trade-off that characterizes their relative effectiveness. Tacit system responses can be subject to biases. In making deliberate system responses, however, people might not be aware of the correct rule to deal with the task they are facing and/or make errors in executing it. Whether tacit or deliberate responses are more valid in particular circumstances requires assessing this trade-off. In this, the probability of making errors in deliberate thought is postulated to be a function of the analytical complexity of the task as perceived by the person. Thus the trade-off is one of bias (in implicit responses) versus analytical complexity (when tasks are handled in deliberate mode). Finally, it is noted that whereas much attention has been paid in the past to helping people make decisions in deliberate mode, efforts should also be directed toward improving the ability to make decisions in tacit mode, since the effectiveness of decisions clearly depends on both. This therefore represents an important frontier for research.
Abstract:
In this article, I address epistemological questions regarding the status of linguistic rules and the pervasive, though seldom discussed, tension that arises between theory-driven object perception by linguists on the one hand and ordinary speakers' possible intuitive knowledge on the other. Several issues will be discussed using examples from French verb morphology, based on the 6500 verbs in Le Petit Robert dictionary (2013).
Abstract:
Spinoza's philosophy seeks to reconcile and unite three fundamental philosophical horizons: Neoplatonic emanation (expression), Cartesian mechanism (efficient cause), and the Aristotelian categories (substance, attribute, mode). This first point is taken for granted. We explain that the attempt was made possible by the new seventeenth-century conception of the actuality of the infinite. We then examine the consequences of this new interpretation, which renders the individual transparent to itself on a plane of immanence, expressive with respect to an eminence that diffuses it, yet determined within a fictive substantiality among finite objects. By proposing the power of the imagination and of the prophets as the starting point and active principle of the conatus, we show that distinction, in Spinoza, always remains a fiction. In conclusion, we indicate how Nietzsche's Zarathustra proceeds from a will to continue the work undertaken by Spinoza.
Abstract:
The paper reports an interactive tool for calibrating a camera, suitable for use in outdoor scenes. The motivation for the tool was the need to obtain an approximate calibration for images taken with no explicit calibration data. Such images are frequently presented to research laboratories, especially in surveillance applications, with a request to demonstrate algorithms. The method decomposes the calibration parameters into intuitively simple components, and relies on the operator interactively adjusting the parameter settings to achieve a visually acceptable agreement between a rectilinear calibration model and his own perception of the scene. Using the tool, we have been able to calibrate images of unknown scenes, taken with unknown cameras, in a matter of minutes. The standard of calibration has proved to be sufficient for model-based pose recovery and tracking of vehicles.
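As a generic illustration of decomposing a calibration into intuitively simple components, the sketch below builds a pinhole projection matrix from a focal length, camera height, pan, and tilt and projects a rectilinear ground-plane grid; the parameterization and values are hypothetical and are not the interface of the tool described above.

# Hypothetical decomposition of a calibration into intuitive components
# (focal length, camera height, pan, tilt); illustrative, not the actual tool.
import numpy as np

def world_to_camera_rotation(pan_deg, tilt_deg):
    """Rotation from a Z-up world frame to a camera frame (x right, y down, z forward):
    pan about the vertical axis, then tilt down from the horizon."""
    p = np.radians(pan_deg)
    t = np.radians(90.0 + tilt_deg)          # 90 deg brings the optical axis to the horizon
    Rz = np.array([[np.cos(p), -np.sin(p), 0.0], [np.sin(p), np.cos(p), 0.0], [0.0, 0.0, 1.0]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, np.cos(t), -np.sin(t)], [0.0, np.sin(t), np.cos(t)]])
    return Rx @ Rz

def projection_matrix(focal_px, cx, cy, pan_deg, tilt_deg, height_m):
    K = np.array([[focal_px, 0.0, cx], [0.0, focal_px, cy], [0.0, 0.0, 1.0]])
    R = world_to_camera_rotation(pan_deg, tilt_deg)
    t = -R @ np.array([0.0, 0.0, height_m])  # camera sits height_m above the world origin
    return K @ np.hstack([R, t.reshape(3, 1)])

# Project a rectilinear ground-plane grid so the parameters can be judged against
# the scene, in the spirit of the interactive adjustment described above.
P = projection_matrix(focal_px=800, cx=320, cy=240, pan_deg=10, tilt_deg=40, height_m=6)
for X in range(-10, 11, 5):
    for Y in range(5, 26, 5):
        u, v, w = P @ np.array([float(X), float(Y), 0.0, 1.0])
        print(f"ground point ({X:3d},{Y:3d}) m -> pixel ({u / w:7.1f}, {v / w:7.1f})")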
Abstract:
In 'Avalanche', an object is lowered by a group of players who must all stay in contact with it throughout. Normally the task is easily accomplished, but with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory of the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. With more players, sets of balancing loops interact, and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model gives insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. The behaviours are seen to arise only when the geometric effects apply (number of players greater than degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations, the Prisoner's Dilemma, and integrated bargaining situations. Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building, and the case is made for formal theory building involving the use of models, analysis, and plausible explanations to create deep understanding of social phenomena.
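As a generic illustration of the two loop types mentioned above (not the authors' Avalanche model), the sketch below integrates a goal-seeking balancing loop and a self-amplifying reinforcing loop with the Euler method; all constants are arbitrary.

# Generic balancing vs reinforcing feedback, integrated with the Euler method;
# this is NOT the Avalanche model, and every constant here is arbitrary.
dt, steps = 0.1, 100
goal, adjustment_time = 0.0, 2.0     # balancing loop: the level chases a fixed goal
growth_fraction = 0.3                # reinforcing loop: the level feeds its own growth

balancing_level, reinforcing_level = 10.0, 1.0
for _ in range(steps):
    balancing_level += dt * (goal - balancing_level) / adjustment_time
    reinforcing_level += dt * growth_fraction * reinforcing_level

print(f"balancing loop settles toward its goal of {goal}: {balancing_level:.3f}")
print(f"reinforcing loop keeps growing:            {reinforcing_level:.3f}")

When a reinforcing loop of this kind comes to dominate interacting balancing loops, the system 'chases upwards' in the manner the abstract describes.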
Abstract:
The Evaporative Fraction (EF) and the Complementary Relationship (CR), both extensively explored by Wilfried Brutsaert during his productive career, have elucidated the conceptual understanding of evapotranspiration within hydrological science, despite a lack of rigorous proof of validity of either concept. We briefly review Brutsaert's role in the history of these concepts and discuss their appeal and interrelationship.
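For reference, the two concepts are usually written as follows (standard textbook forms, not notation taken from the paper):
\[
  \mathrm{EF} \;=\; \frac{\lambda E}{R_n - G} \;=\; \frac{\lambda E}{\lambda E + H},
  \qquad\qquad
  E_p + E_a \;=\; 2\,E_w ,
\]
where $\lambda E$ is the latent heat flux, $H$ the sensible heat flux, $R_n - G$ the available energy (the two EF expressions coincide when the surface energy balance closes, $R_n - G = \lambda E + H$), and $E_a$, $E_p$, and $E_w$ are the actual, apparent potential, and wet-environment evaporation rates; the second equation is Bouchet's symmetric form of the Complementary Relationship.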
Abstract:
PURPOSE: The aim of this study is to implement augmented reality in real-time image-guided interstitial brachytherapy to allow an intuitive real-time intraoperative orientation. METHODS AND MATERIALS: The developed system consists of a common video projector, two high-resolution charge coupled device cameras, and an off-the-shelf notebook. The projector was used as a scanning device by projecting coded-light patterns to register the patient and superimpose the operating field with planning data and additional information in arbitrary colors. Subsequent movements of the nonfixed patient were detected by means of stereoscopically tracking passive markers attached to the patient. RESULTS: In a first clinical study, we evaluated the whole process chain from image acquisition to data projection and determined overall accuracy with 10 patients undergoing implantation. The described method enabled the surgeon to visualize planning data on top of any preoperatively segmented and triangulated surface (skin) with direct line of sight during the operation. Furthermore, the tracking system allowed dynamic adjustment of the data to the patient's current position and therefore eliminated the need for rigid fixation. Because of soft-part displacement, we obtained an average deviation of 1.1 mm by moving the patient, whereas changing the projector's position resulted in an average deviation of 0.9 mm. Mean deviation of all needles of an implant was 1.4 mm (range, 0.3-2.7 mm). CONCLUSIONS: The developed low-cost augmented-reality system proved to be accurate and feasible in interstitial brachytherapy. The system meets clinical demands and enables intuitive real-time intraoperative orientation and monitoring of needle implantation.
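As a generic building block behind this kind of marker-based patient tracking (not the authors' implementation), the sketch below shows rigid point-set alignment via the Kabsch/Procrustes method; the marker coordinates and simulated movement are invented.

# Rigid alignment of tracked markers to their planning positions
# (Kabsch/Procrustes); a generic sketch, not the authors' system.
import numpy as np

def rigid_align(src, dst):
    """Return R, t such that dst ~= src @ R.T + t (least-squares rigid fit)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Invented marker positions (planning frame) and the same markers after a
# simulated patient movement: a 10-degree rotation plus a small translation.
markers_ct = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 40.0, 0.0], [0.0, 0.0, 30.0]])
angle = np.radians(10.0)
true_R = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
markers_moved = markers_ct @ true_R.T + np.array([5.0, -3.0, 2.0])

R, t = rigid_align(markers_ct, markers_moved)
residual = np.linalg.norm(markers_ct @ R.T + t - markers_moved, axis=1)
print(f"max registration residual: {residual.max():.2e} mm")  # ~0 for noise-free markers

The recovered transform is what lets projected planning data follow the patient instead of requiring rigid fixation.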
Abstract:
Research into intuitive problem solving has shown that participants' hypotheses were objectively closer to the correct solution than their subjective ratings of closeness indicated. After separating conceptually intuitive problem solving from the solution of rational incremental tasks and of sudden insight tasks, we replicated this finding using more precise measures in a conceptual problem-solving task. In a second study, we distinguished performance level, processing style, implicit knowledge, and subjective feeling of closeness to the solution within the problem-solving task, and examined the relationships of these components with measures of intelligence and personality. Verbal intelligence correlated with performance level in problem solving, but not with processing style or implicit knowledge. Faith in intuition, openness to experience, and conscientiousness correlated with processing style, but not with implicit knowledge. These findings suggest that one needs to decompose processing style and intuitive components in problem solving to make predictions about the effects of intelligence and personality measures.