404 results for Intuition.


Relevance:

10.00%

Publisher:

Abstract:

Executives must walk a fine line between two extremes: on the one hand, arbitrary decision making without rigorous study of the problem situation, based on hunches or intuition; on the other, leaning too heavily on rational, quantitative analyses. Decisions are generally not taken by a single executive; they require the participation of the groups involved in the problem. In this social sense it is undeniable that conflicts of interest sometimes arise, frustrating the participatory and collaborative context in which solutions are devised and implemented, whereas in other contexts collaboration converges into consensus solutions.

Relevance:

10.00%

Publisher:

Abstract:

The industrial revolution and the subsequent industrialization of the economies occurred first in temperate regions. We argue that this, and the associated positive correlation between absolute latitude and GDP per capita, is due to the fact that countries located far from the equator suffered more profound seasonal fluctuations in climate, namely stronger and longer winters. We propose a growth model of biased innovations that accounts for these facts and show that countries located in temperate regions were more likely to create or adopt capital-intensive modes of production. The intuition behind this result is that savings are used to smooth consumption; therefore, in places where output fluctuations are more profound, savings are larger. Because the incentives to innovate depend on the relative supply of factors, economies where savings are larger are more likely to create or adopt capital-intensive technologies.
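A minimal two-period sketch of that intuition (an illustrative assumption for exposition, not the paper's model): let seasonal output be $y_1 = \bar{y} + \sigma$ in the good season and $y_2 = \bar{y} - \sigma$ in winter, with zero interest. Perfect consumption smoothing sets $c_1 = c_2 = \bar{y}$, so savings carried over from the good season are

$$ s = y_1 - c_1 = \sigma, $$

rising one-for-one with the amplitude $\sigma$ of the seasonal fluctuation. A larger pool of savings then raises the relative supply of capital, tilting innovation and adoption toward capital-intensive techniques.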

Relevance:

10.00%

Publisher:

Abstract:

Organizational leaders must confront the environmental challenges of the business world and a variety of pressures that place them, day after day, at high ethical risk. Navigating these risks has demanded substantial changes in the dynamics of contemporary organizations, so the demands on managers to make sound decisions in situations of high moral complexity are ever greater. These decisions involve the ethical behavior of the decision maker, which in turn is mediated by his or her emotions.

Relevance:

10.00%

Publisher:

Abstract:

In November 2008, Colombian authorities dismantled a network of Ponzi schemes, causing hundreds of thousands of investors throughout the country to lose tens of millions of dollars. Using original data on the geographical incidence of the Ponzi schemes, this paper estimates the impact of their breakdown on crime. We find that the crash of the Ponzi schemes differentially exacerbated crime in affected districts. Confirming the intuition of the standard economic model of crime, this effect is present only in places with relatively weak judicial and law enforcement institutions and with little access to consumption-smoothing mechanisms such as microcredit. In addition, we show that, with the exception of economically motivated felonies such as robbery, violent crime is not affected by the negative shock.
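The phrase "differentially exacerbated crime in affected districts" describes a difference-in-differences design. The paper's actual specification (controls, fixed effects, clustered errors) is surely richer; the following Python sketch, on simulated data with hypothetical variable names, only illustrates the generic estimator:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "affected": rng.integers(0, 2, n),  # district exposed to a Ponzi scheme
    "post": rng.integers(0, 2, n),      # observation after the November 2008 crash
})
# Simulated crime rate: the interaction term carries the treatment effect (= 3 here)
df["crime"] = (10 + 2 * df["affected"] + 1 * df["post"]
               + 3 * df["affected"] * df["post"] + rng.normal(0, 1, n))

m = smf.ols("crime ~ affected * post", data=df).fit()
print(m.params["affected:post"])  # difference-in-differences estimate, close to 3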

Relevance:

10.00%

Publisher:

Abstract:

This thesis, although framed within the theory of Molecular Quantum Similarity Measures (MQSM), branches into three clearly defined areas:
- The creation of Molecular IsoDensity COntours (MIDCOs) from fitted electron densities.
- The development of a molecular superposition method, as an alternative to the maximum-similarity rule.
- Quantitative Structure-Activity Relationships (QSAR).
The objective in the MIDCO field is the application of fitted density functions, originally devised to cheapen MQSM calculations, to the generation of MIDCOs. A graphical comparison is carried out between density functions fitted to different basis sets and densities obtained from ab initio calculations. The visual analogy between the fitted and ab initio functions across the range of density representations obtained, together with the previously computed and fully comparable similarity measures, justifies the use of these fitted functions. Beyond the initial purpose, two studies complementary to the simple representation of densities were carried out: a curvature analysis and an extension to macromolecules. The first checks not only the resemblance of the MIDCOs but also the coherence of their behaviour in terms of curvature, making it possible to locate inflection points in the density representations and to see graphically the zones where the density is concave or convex. This first study reveals that the fitted densities and those calculated at the ab initio level behave in a fully analogous way. In the second part of this work the method was extended to larger molecules, of up to about 2500 atoms. Finally, part of the MEDLA philosophy is applied: since the electron density decays rapidly away from the nuclei, its calculation can be omitted at large distances from them. Space is therefore partitioned, and the fitted functions of each atom are evaluated only within a small region surrounding the atom in question. This procedure reduces the computation time and makes the process scale linearly with the number of atoms in the molecule treated.
The chapter devoted to molecular superposition deals with the creation of an algorithm, and its implementation as a program, christened the Topo-Geometrical Superposition Algorithm (TGSA), a method designed to provide the alignments that agree with chemical intuition. The result is a computer program, coded in Fortran 90, that aligns molecules pairwise considering only atomic numbers and distances. The complete absence of theoretical parameters yields a general molecular superposition method that produces an intuitive superposition and, just as importantly, does so quickly and with little user intervention. TGSA has mainly been used to compute similarities for later use in QSAR; these mostly do not coincide with the values that the maximum-similarity rule would give, especially when heavy atoms are involved. Finally, the last chapter, devoted to Quantum Similarity in the QSAR framework, covers three different aspects:
- The use of similarity matrices. Here the so-called similarity matrix intervenes, computed from the pairwise similarities of a set of molecules. Suitably processed, this matrix is then used as a source of molecular descriptors for QSAR studies. Within this field several correlation studies of pharmacological and toxicological interest, as well as of various physical properties, have been carried out.
- The application of the electron-electron interaction energy, treated as a form of self-similarity. This modest contribution consists briefly in taking the value of this quantity and, by analogy with the notation of molecular quantum self-similarity, treating it as a particular case of that measure. This interaction energy is easily obtained from quantum-mechanical software and is ideal for a first preliminary correlation study in which it serves as the sole descriptor.
- The calculation of self-similarities in which the density has been modified to enhance the role of a substituent. Previous work with fragment densities, despite giving very good results, lacks a certain conceptual rigour in isolating a fragment, supposedly responsible for the molecular activity, from the whole molecular structure, even though the densities associated with that fragment already differ because they belong to skeletons with different substitutions. A procedure that fills the gap left by simple fragment separation, thereby considering the whole molecule (computing its self-similarity) while avoiding unwanted self-similarity values caused by heavy atoms, is the use of Fermi hole densities defined around the fragment of interest. This procedure modifies the density so that it is mostly concentrated in the region of interest, yet still yields a density function that behaves mathematically like the regular electron density and can therefore be incorporated into the molecular similarity framework. Self-similarities calculated with this methodology have led to good correlations for substituted aromatic acids, providing an explanation of their behaviour. From another point of view, conceptual contributions have also been made. A new similarity measure, based on kinetic energy, has been implemented: it takes the recently developed kinetic energy density function which, behaving mathematically like the regular electron density, has been incorporated into the similarity framework. Satisfactory QSAR models have been obtained from this measure for different molecular sets. Within the treatment of similarity matrices, the so-called stochastic transformation has been implemented as an alternative to the use of the Carbó index. This transformation of the similarity matrix yields a new non-symmetric matrix, which can subsequently be processed to build QSAR models.
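To make the matrix manipulations concrete, here is a minimal Python sketch. The Carbó index formula C_AB = Z_AB / sqrt(Z_AA Z_BB) is standard in the MQSM literature; the row normalisation used for the stochastic transformation is an assumption for illustration and not necessarily the exact transformation of the thesis:

import numpy as np

def carbo_index(Z):
    """Normalise a matrix of quantum similarity measures Z_AB into
    Carbo indices C_AB = Z_AB / sqrt(Z_AA * Z_BB)."""
    d = np.sqrt(np.diag(Z))
    return Z / np.outer(d, d)

def stochastic_transform(Z):
    """Assumed form of the stochastic transformation: divide each row
    by its sum, giving a row-stochastic, generally non-symmetric matrix."""
    return Z / Z.sum(axis=1, keepdims=True)

# Toy symmetric matrix of pairwise similarity measures for 3 molecules
Z = np.array([[4.0, 1.2, 0.8],
              [1.2, 3.0, 0.9],
              [0.8, 0.9, 2.5]])

C = carbo_index(Z)           # symmetric, ones on the diagonal
S = stochastic_transform(Z)  # non-symmetric source of QSAR descriptors
print(C)
print(S)

Either matrix can then be fed, column by column, as molecular descriptors into an ordinary regression to build a QSAR model.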

Relevance:

10.00%

Publisher:

Abstract:

It is argued that critics tend to ignore the short-story collections that Humberto Salvador published between the 1930s and the 1980s. Throughout his work, Salvador recreates the society and culture around him, seeking to decipher their moral essence; to this end, he grants intuition a positive value over reason. Among the early themes of the Guayaquil author (1920s), his reflection on art stands out: art as a production immersed in culture, history and society, a work being the conjunction of literary labour and the chance of everyday life, and therefore an object waiting to be found. After this "aesthetic" period, Salvador sought to portray society, mainly through déclassé characters who adhere to no orthodox code, with an emphasis on psychological aspects and mental states (illnesses such as schizophrenia, hysteria, etc.). According to the critic Wilfrido H. Corral, the Guayaquilean's stories progress from the theme of the artist to that of everyday life, and from this to that of the less free artist; they are, he insists, gems of existential comedy, of anguish and of domestic moderation.

Relevance:

10.00%

Publisher:

Abstract:

Matheron's usual variogram estimator can result in unreliable variograms when data are strongly asymmetric or skewed. Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers belonging to another population that contaminate the primary process. This paper examines the effects of underlying asymmetry on the variogram and on the accuracy of prediction; a second paper examines the effects arising from outliers. Standard geostatistical texts suggest ways of dealing with underlying asymmetry; however, these are based on informed intuition rather than detailed investigation. To determine whether the methods generally used to deal with underlying asymmetry are appropriate, the effects of different coefficients of skewness on the shape of the experimental variogram and on the model parameters were investigated. Simulated annealing was used to create normally distributed random fields of different sizes from variograms with different nugget:sill ratios. These data were then modified to give different degrees of asymmetry, and the experimental variogram was computed in each case. The effects of standard data transformations on the form of the variogram were also investigated. Cross-validation was used to assess quantitatively the performance of the different variogram models for kriging. The results showed that the shape of the variogram was affected by the degree of asymmetry, and that the effect increased as the size of the data set decreased. Transformations of the data were more effective in reducing the skewness coefficient in the larger data sets. Cross-validation confirmed that variogram models from transformed data were more suitable for kriging than those from the raw asymmetric data. The results of this study have implications for "standard best practice" in dealing with asymmetry in data for geostatistical analyses. (C) 2007 Elsevier Ltd. All rights reserved.
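The estimator in question is standard: gamma(h) = 1/(2 N(h)) * sum over pairs at lag h of (z(x_i) - z(x_i + h))^2. A minimal Python sketch on a 1-D transect, with toy data and an illustrative log transform of the kind the abstract evaluates (the data and transform are illustrative, not the paper's):

import numpy as np

def matheron_variogram(z, max_lag):
    """Matheron's classical estimator on a regularly spaced 1-D transect:
    gamma(h) = 1/(2*N(h)) * sum_i (z[i+h] - z[i])^2."""
    gamma = np.empty(max_lag)
    for h in range(1, max_lag + 1):
        d = z[h:] - z[:-h]                    # all pairs separated by lag h
        gamma[h - 1] = 0.5 * np.mean(d ** 2)
    return gamma

rng = np.random.default_rng(1)
z = np.cumsum(rng.normal(size=200))           # a spatially correlated field
z_skewed = np.exp(0.5 * z / z.std())          # lognormal-style asymmetry
print(matheron_variogram(z, 10))
print(matheron_variogram(np.log(z_skewed), 10))  # log transform tames the skew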

Relevance:

10.00%

Publisher:

Abstract:

Three experiments examine whether simple pair-wise comparison judgments, involving the “recognition heuristic” (Goldstein & Gigerenzer, 2002), are sensitive to implicit cues to the nature of the comparison required. Experiments 1 & 2 show that participants frequently choose the recognized option of a pair if asked to make “larger” judgments but are significantly less likely to choose the unrecognized option when asked to make “smaller” judgments. Experiment 3 demonstrates that, overall, participants consider recognition to be a more reliable guide to judgments of a magnitude criterion than lack of recognition and that this intuition drives the framing effect. These results support the idea that, when making pair-wise comparison judgments, inferring that the recognized item is large is simpler than inferring that the unrecognized item is small.
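A minimal sketch of the decision rule under discussion, stated deterministically (real participants apply it probabilistically, and, per the abstract, less consistently in the "smaller" frame):

def recognition_choice(recognized_a, recognized_b, frame="larger"):
    """Recognition heuristic for a pair-wise magnitude comparison.
    Applies only when exactly one of the two options is recognized."""
    if recognized_a == recognized_b:
        return None  # heuristic is silent: both or neither recognized
    recognized = "A" if recognized_a else "B"
    unrecognized = "B" if recognized_a else "A"
    # "larger" frame: infer the recognized item is large (the strong intuition);
    # "smaller" frame: infer the unrecognized item is small (the weaker
    # inference that participants follow less often).
    return recognized if frame == "larger" else unrecognized

print(recognition_choice(True, False, "larger"))   # -> A
print(recognition_choice(True, False, "smaller"))  # -> B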

Relevance:

10.00%

Publisher:

Abstract:

The objective of this paper is to revisit the von Liebig hypothesis by reexamining five samples of experimental data and by applying recent advances in Bayesian techniques to them. The samples were published by Hexem and Heady, as described in a later section. Prior to outlining the estimation strategy, we discuss the intuition underlying our approach and, briefly, the literature on which it is based. We present an algorithm for the basic von Liebig formulation and demonstrate its application using simulated data (table 1). We then discuss the modifications to the basic model needed to facilitate estimation of a von Liebig frontier, and we demonstrate the extended algorithm using simulated data (table 2). We then explore, empirically, the relationships between limiting water and nitrogen in the Hexem and Heady corn samples and compare the results between the two formulations (table 3). Finally, some conclusions and suggestions for further research are offered.
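For readers unfamiliar with the hypothesis: the von Liebig "law of the minimum" says yield is set by the scarcest input, i.e. y = min(a_N + b_N N, a_W + b_W W, plateau). A minimal Python sketch with purely illustrative coefficients (not estimates from the Hexem and Heady samples):

import numpy as np

def von_liebig_yield(N, W, bN=(1.0, 0.08), bW=(0.5, 0.12), plateau=9.0):
    """Law-of-the-minimum response: yield is the smallest of the linear
    nitrogen response, the linear water response, and a plateau."""
    return np.minimum(np.minimum(bN[0] + bN[1] * N, bW[0] + bW[1] * W), plateau)

print(von_liebig_yield(N=60, W=40))  # water-limited here: min(5.8, 5.3, 9.0) = 5.3

The estimation problem the paper addresses is that the min() kink makes classical likelihoods awkward, which is where the Bayesian machinery comes in.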

Relevance:

10.00%

Publisher:

Abstract:

A review of current risk pricing practices in the financial, insurance and construction sectors was conducted through a comprehensive literature review. The purpose was to inform a study of risk and price in the tendering processes of contractors: specifically, how contractors take account of risk when calculating their bids for construction work. Mainstream literature was consulted because construction management research is treated here as a field of application rather than a fundamental academic discipline. Analytical models are used for risk pricing in the financial sector, and certain mathematical laws and principles of insurance are used to price risk in the insurance sector. Construction contractors and practitioners, by contrast, are traditionally described as pricing allowances for project risk using mechanisms such as intuition and experience. Project risk analysis models have proliferated in recent years; however, they are rarely used because of the problems practitioners face when confronted with them. A discussion of practices across the three sectors shows that the construction industry does not approach risk with the sophisticated mechanisms of the other two sectors. This is not in itself a poor situation, but knowledge transfer from finance and insurance can help construction practitioners. At the same time, formal risk models for contractors should be informed by the commercial exigencies and unique characteristics of the construction sector.
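As one concrete instance of the insurance-sector "mathematical laws" alluded to (the abstract names none; the expected-value premium principle is offered here as a standard textbook example): the premium for a risk with random loss $X$ is set as $\Pi = (1+\theta)\,\mathbb{E}[X]$, where $\theta > 0$ is a safety loading. This is a formal analogue of the contingency allowance a contractor adds to a bid by intuition and experience.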

Relevance:

10.00%

Publisher:

Abstract:

Recently, a substantial amount of research has been done in the field of dextrous manipulation and hand manoeuvres. The main concern has been how to control robot hands so that they can execute manipulation tasks with the same dexterity and intuition as human hands. This paper surveys multi-fingered robot hand research and development topics, including robot hand design, object force distribution and control, grip transforms, grasp stability and its synthesis, grasp stiffness and compliant motion, and robot arm-hand coordination. Three main topics are presented in this article: the first is an introduction to the subject; the second concentrates on examples of mechanical manipulators used in research and the methods employed to control them; the third presents work done in the field of object manipulation.

Relevance:

10.00%

Publisher:

Abstract:

Programming is a skill which requires knowledge both of the basic constructs of the computer language used and of techniques employing these constructs. How these are used in any given application is determined intuitively, and this intuition is based on experience of programs already written. One aim of this book is to describe the techniques and give practical examples of them in action, to provide some of that experience. Another aim is to show how a program should be developed, in particular how a relatively large program should be tackled in a structured manner. These aims are accomplished essentially by describing the writing of one large program, a diagram generator package, in which a number of useful programming techniques are employed. The book also provides a useful program with an in-built manual describing not only what the program does but also how it does it, with full source code listings; the user can therefore, if required, modify the package to meet particular requirements. A floppy disk containing the program, including listings of the source code, is available from the publishers. All the programs are written in Modula-2, using JPI's Top Speed Modula-2 system running on IBM PCs and compatibles. This language was chosen as it is an ideal language for implementing large programs and is the main language taught in the Cybernetics Department at the University of Reading. Some aspects of the Top Speed implementation are not standard, so suitable comments are given where these occur. Although implemented in Modula-2, many of the techniques described here are applicable to other languages, such as Pascal or C. The book and programs are based on a second-year undergraduate course taught at Reading to Cybernetics students, entitled Algorithms and Data Structures. Useful techniques are described for the reader to use, and applications where they are appropriate are recommended, but detailed analyses of the techniques are not given.

Relevance:

10.00%

Publisher:

Abstract:

In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of atmosphere. Using 2D fields at the top of atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition on the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula of the material entropy production is verified and used for studying the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset in preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes to only about 10% of the total. This suggests that the traditional two-box models used for providing a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic—but conceptually correct—description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m−2 K−1, with discrepancies on the order of 5%, and CMs’ baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by CMs are found to be around 1.0–1.5 W m−2, which implies that the derived inequality is rather stringent. When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
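As a back-of-envelope restatement of the abstract's own numbers: with total material entropy production $\dot{S}_{\mathrm{mat}} \approx 55\ \mathrm{mW\,m^{-2}\,K^{-1}}$ split roughly 90/10 between vertical and horizontal processes, the contributions are $\dot{S}_{\mathrm{vert}} \approx 0.9 \times 55 \approx 50\ \mathrm{mW\,m^{-2}\,K^{-1}}$ and $\dot{S}_{\mathrm{horiz}} \approx 5.5\ \mathrm{mW\,m^{-2}\,K^{-1}}$, which is consistent with the abstract's point that two-box, purely meridional models miss the dominant vertical term.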

Relevance:

10.00%

Publisher:

Abstract:

Consent's capacity to legitimise actions and claims is limited by conditions such as coercion, which render consent ineffective. A better understanding of the limits to consent's capacity to legitimise can shed light on a variety of applied debates, in political philosophy, bioethics, economics and law. I show that traditional paternalist explanations for limits to consent's capacity to legitimise cannot explain the central intuition that consent is often rendered ineffective when brought about by a rights violation or threatened rights violation. I argue that this intuition is an expression of the same principles of corrective justice that underlie norms of compensation and rectification. I show how these principles can explain and clarify core intuitions about conditions which render consent ineffective, including those concerned with the consenting agent's option set, his mental competence, and available information.

Relevance:

10.00%

Publisher:

Abstract:

We introduce an algorithm (called REDFITmc2) for spectrum estimation in the presence of timescale errors. It is based on the Lomb-Scargle periodogram for unevenly spaced time series, in combination with Welch's Overlapped Segment Averaging procedure, bootstrap bias correction and persistence estimation. The timescale errors are modelled parametrically and included in the simulations for determining (1) the upper levels of the spectrum of the red-noise AR(1) alternative and (2) the uncertainty of the frequency of a spectral peak. Application of REDFITmc2 to ice core and stalagmite records of palaeoclimate allowed a more realistic evaluation of spectral peaks than when this source of uncertainty is ignored. The results support qualitatively the intuition that stronger effects on the spectrum estimate (decreased detectability and increased frequency uncertainty) occur at higher frequencies; the surplus information brought by REDFITmc2 is that those effects are quantified. Regarding timescale construction, not only the fixpoints, dating errors and the functional form of the age-depth model play a role; the joint distribution of all time points (serial correlation, stratigraphic order) also determines the spectrum estimate.
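A minimal Python sketch of the core idea only, propagating parametric timescale errors into the periodogram by simulation. This is not REDFITmc2 itself (no WOSA segmenting, bootstrap bias correction, persistence estimation or AR(1) red-noise levels), and the jitter model is an assumption for illustration:

import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 100, 300))              # unevenly spaced time points
y = np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 0.5, t.size)
freqs = np.linspace(0.01, 0.5, 500)                # cycles per time unit
w = 2 * np.pi * freqs                              # lombscargle wants angular freq

peaks = []
for _ in range(200):
    # Parametric timescale error; np.sort keeps stratigraphic order
    tj = np.sort(t + rng.normal(0, 1.0, t.size))
    p = lombscargle(tj, y - y.mean(), w)
    peaks.append(freqs[np.argmax(p)])

# Peak frequency and its uncertainty induced by the timescale errors
print(np.mean(peaks), np.std(peaks))

Repeating this with a higher signal frequency widens the spread of peak estimates, the qualitative effect the abstract reports.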