909 results for ambiguity aversion
Abstract:
In this work we solve the uncalibrated photometric stereo problem with lights placed near the scene. We investigate different image formation models and find the one that best fits our observations. Although the devised model is more complex than its far-light counterpart, we show that under a global linear ambiguity the reconstruction is possible up to a rotation and scaling, which can be easily fixed. We also propose a solution for reconstructing the normal map, the albedo, the light positions and the light intensities of a scene given only a sequence of near-light images. This is done in an alternating minimization framework which first estimates both the normals and the albedo, and then the light positions and intensities. We validate our method on real world experiments and show that a near-light model leads to a significant improvement in the surface reconstruction compared to the classic distant illumination case.
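As a rough illustration of the alternating scheme this abstract describes, the sketch below alternates between a linear least-squares estimate of the albedo-scaled normals and a nonlinear refinement of the point-light positions and intensities. It assumes a simple Lambertian near-light model with inverse-square falloff and known pixel positions; the function names, the shadow-free linear step and the use of scipy.optimize.least_squares are my own choices for the sketch, not the authors' implementation.

    import numpy as np
    from scipy.optimize import least_squares

    def render(m, x, p, phi):
        # Near-light Lambertian model: phi_j * max(0, m_i . (p_j - x_i)) / ||p_j - x_i||^3
        d = p[None, :, :] - x[:, None, :]            # (n_pix, n_lights, 3) pixel-to-light vectors
        r = np.linalg.norm(d, axis=2)
        return phi[None, :] * np.maximum(np.einsum('ik,ijk->ij', m, d), 0.0) / r**3

    def update_normals(I, x, p, phi):
        # Linear least squares per pixel for m_i = albedo_i * n_i (attached shadows ignored here).
        d = p[None, :, :] - x[:, None, :]
        r = np.linalg.norm(d, axis=2, keepdims=True)
        A = phi[None, :, None] * d / r**3            # (n_pix, n_lights, 3) design matrices
        return np.stack([np.linalg.lstsq(A[i], I[i], rcond=None)[0] for i in range(len(x))])

    def update_lights(I, x, m, p0, phi0):
        # Nonlinear refinement of light positions and intensities, normals held fixed.
        def residual(theta):
            p, phi = theta[:p0.size].reshape(p0.shape), theta[p0.size:]
            return (render(m, x, p, phi) - I).ravel()
        theta = least_squares(residual, np.concatenate([p0.ravel(), phi0])).x
        return theta[:p0.size].reshape(p0.shape), theta[p0.size:]

    def alternate(I, x, p, phi, n_iter=10):
        # I: (n_pix, n_lights) intensities, x: (n_pix, 3) pixel positions,
        # p: (n_lights, 3) initial light positions, phi: (n_lights,) initial intensities.
        for _ in range(n_iter):
            m = update_normals(I, x, p, phi)
            p, phi = update_lights(I, x, m, p, phi)
        return m, p, phi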
Abstract:
Complexity has long been recognized and is increasingly becoming mainstream in geomorphology. However, the relative novelty of various concepts and techniques associated with it means that ambiguity continues to surround complexity. In this commentary, we present and discuss a variety of recent contributions that have the potential to help clarify issues and advance the use of complexity in geomorphology.
Abstract:
Since the mid-1980s, migrants from North African and sub-Saharan countries have irregularly crossed the Strait of Gibraltar, in the hope of a better future for themselves and their families. Travelling in small, poorly equipped boats without experienced captains has cost the lives of myriad border crossers. Many of these bodies will never be recovered and the bereaved will never know whether their relatives and friends are alive or not. The bereaved are thus condemned to a state of not knowing and uncertainty. Exploring the junction of death and belonging, I firstly open a discussion about the enigmatic relation between a dead body and a dead person and argue for the importance of the physical presence of the body for mourning. Secondly, I show how the anonymity of dead border crossers and their uncertain belongings are generated, concealed, or rewritten. Following the story of an undertaker, I thirdly examine post-mortem border crossings. Depicting the power relations within identification processes, I outline the ambiguity of the term belonging by emphasising the constitutive significance of personal belongings such as clothes to restore a person’s identity. Reflecting on the ethical relationships which different actors (including the researcher) undertake with the deceased, I aim at gaining a better understanding of the multiple belonging of dead border crossers found on Spanish shores.
Abstract:
We demonstrate a new ultrafast pulse reconstruction modality that is somewhat reminiscent of frequency-resolved optical gating but uses a modified setup and a conceptually different reconstruction algorithm that is derived from ptychography. Even though it is a second-order correlation scheme, it shows no time ambiguity. Moreover, the number of spectra to record is considerably smaller than in most other related schemes, which, together with a robust algorithm, leads to extremely fast convergence of the reconstruction.
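To make the ptychographic idea concrete, here is a generic time-domain ptychographic iterative engine (PIE) update loop of the kind this family of pulse-retrieval algorithms builds on. It assumes a known gate pulse and spectra already sampled on the FFT frequency grid; it is a simplified sketch of the general principle, not the second-order scheme reported in this work, and the names and the update parameter beta are illustrative.

    import numpy as np

    def reconstruct_pulse(S, gate, delays, dt, n_iter=100, beta=0.2):
        # S[k]  : measured spectrum |FT{P(t) * gate(t - delays[k])}|^2, sampled on the
        #         numpy FFT frequency grid of the time axis (illustrative assumption)
        # gate  : complex gate field on the same time grid as the unknown pulse P
        n = gate.size
        t = np.arange(n) * dt
        P = np.exp(-((t - t.mean()) / (10 * dt)) ** 2).astype(complex)  # crude initial guess
        for _ in range(n_iter):
            for k, tau in enumerate(delays):
                # delayed gate (linear interpolation of real and imaginary parts)
                g = np.interp(t - tau, t, gate.real) + 1j * np.interp(t - tau, t, gate.imag)
                psi = P * g                                       # exit field for this delay
                Psi = np.fft.fft(psi)
                Psi = np.sqrt(S[k]) * np.exp(1j * np.angle(Psi))  # impose measured modulus
                psi_new = np.fft.ifft(Psi)
                # PIE-style update of the pulse estimate
                P = P + beta * np.conj(g) / (np.max(np.abs(g) ** 2) + 1e-12) * (psi_new - psi)
        return P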
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have been intrigued for a long time by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement.

This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare.

Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative.

In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions.

I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise, because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, much in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
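To make the notion of a partition function concrete, the sketch below encodes a hypothetical three-player game in which a coalition's worth depends on how the remaining players are organised (a positive externality: singletons do better when the other two cooperate), and then searches for the efficient coalition structure. The numbers and helper functions are purely illustrative and are not taken from the thesis.

    # Enumerate all coalition structures (partitions) of a player set.
    def partitions(players):
        players = list(players)
        if not players:
            yield []
            return
        first, rest = players[0], players[1:]
        for smaller in partitions(rest):
            for i in range(len(smaller)):                     # join an existing coalition
                yield smaller[:i] + [smaller[i] | {first}] + smaller[i + 1:]
            yield [{first}] + smaller                         # or stay alone

    def canon(structure):
        # Canonical key: coalitions as frozensets, sorted by their member lists.
        return tuple(sorted((frozenset(s) for s in structure), key=sorted))

    # Hypothetical partition function: worth of each coalition in each structure.
    # Singletons earn more when the other two cooperate -- a positive externality
    # that creates an incentive to free ride on the grand coalition.
    v = {
        canon([{1, 2, 3}]): {frozenset({1, 2, 3}): 9},
        canon([{1, 2}, {3}]): {frozenset({1, 2}): 5, frozenset({3}): 3},
        canon([{1, 3}, {2}]): {frozenset({1, 3}): 5, frozenset({2}): 3},
        canon([{2, 3}, {1}]): {frozenset({2, 3}): 5, frozenset({1}): 3},
        canon([{1}, {2}, {3}]): {frozenset({1}): 2, frozenset({2}): 2, frozenset({3}): 2},
    }

    def total_worth(structure):
        return sum(v[canon(structure)].values())

    # The efficient coalition structure maximises aggregate worth.
    best = max(partitions({1, 2, 3}), key=total_worth)
    print(best, total_worth(best))   # -> grand coalition with worth 9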
Abstract:
A recent stream of organizational research has used the term serious play to describe situations in which people engage in playful behaviors deliberately with the intention to achieve serious, work-related objectives. In this article, the authors reflect on the ambiguity of this term, and reframe serious play as a practice characterized by the paradox of intentionality (when actors engage deliberately in a fun, intrinsically motivating activity as a means to achieve a serious, extrinsically motivated work objective). This reframing not only extends the explanatory power of the concept of serious play but also helps bridge the concerns of scholars and practitioners: first, by enabling us to understand a variety of activities in organizations as serious play, which can help practitioners address specific organizational challenges; second, by recognizing the potential for emergent serious play, and the creation of the conditions to foster this emergence; third, by pointing toward specific individual- or group-level outcomes associated with the practice; and finally, by uncovering its ethical dimensions and encouraging the understanding of the role of serious play in ethical decision making.
Abstract:
Polymorbid patients, diverse diagnostic and therapeutic options, more complex hospital structures, financial incentives, benchmarking, as well as perceptual and societal changes put pressure on medical doctors, especially when medical errors surface. This is particularly true for the emergency department setting, where patients face delayed or erroneous initial diagnostic or therapeutic measures and costly hospital stays due to sub-optimal triage. A "biomarker" is any laboratory tool with the potential to better detect and characterise diseases, to simplify complex clinical algorithms and to improve clinical problem solving in routine care. Biomarkers must be embedded in clinical algorithms to complement, not replace, basic medical skills. Unselected ordering of laboratory tests and shortcomings in test performance and interpretation contribute to diagnostic errors. Test results may be ambiguous, and false positive or false negative results generate unnecessary harm and costs. Laboratory tests should only be ordered if the results have clinical consequences. In studies, we must move beyond the observational reporting and meta-analysing of diagnostic accuracies for biomarkers. Instead, specific cut-off ranges should be proposed and intervention studies conducted to prove outcome-relevant impacts on patient care. The focus of this review is to exemplify the appropriate use of selected laboratory tests in the emergency setting for which randomised controlled intervention studies have proven clinical benefit. Herein, we focus on initial patient triage and allocation of treatment opportunities in patients with cardiorespiratory diseases in the emergency department. The following biomarkers will be discussed: proadrenomedullin for prognostic triage assessment and site-of-care decisions, cardiac troponin for acute myocardial infarction, natriuretic peptides for acute heart failure, D-dimers for venous thromboembolism, C-reactive protein as a marker of inflammation, and procalcitonin for antibiotic stewardship in infections of the respiratory tract and sepsis. For these markers we provide an overview of physiopathology, the historical evolution of the evidence, and strengths and limitations for a rational implementation into clinical algorithms. We critically discuss results from key intervention trials that led to their use in clinical routine and potential future indications. The rationale for the use of all these biomarkers is to tackle, first, diagnostic ambiguity and the defensive medicine that follows from it; second, delayed and sub-optimal therapeutic decisions; and third, prognostic uncertainty with misguided triage and site-of-care decisions, all of which contribute to the waste of our limited health care resources. A multifaceted approach to more targeted management of medical patients from emergency admission to discharge, including biomarkers, will translate into better resource use, shorter hospital stays, reduced overall costs, improved patient satisfaction, and better outcomes in terms of mortality and re-hospitalisation. Hopefully, the concepts outlined in this review will help readers improve their diagnostic skills and become more parsimonious requesters of laboratory tests.
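As a purely illustrative example of what "specific cut-off ranges embedded in a clinical algorithm" can look like in code, the sketch below encodes a procalcitonin-guided antibiotic-stewardship rule using commonly cited 0.1/0.25/0.5 µg/l thresholds; the thresholds and wording are shown only to illustrate the structure of such a rule, not as clinical guidance, and the function name is my own.

    # Illustrative only -- thresholds follow commonly cited antibiotic-stewardship
    # cut-off ranges for procalcitonin (ug/l), shown to demonstrate how a biomarker
    # cut-off rule can be embedded in an algorithm; this is not clinical guidance.
    def procalcitonin_recommendation(pct_ug_per_l: float) -> str:
        if pct_ug_per_l < 0.1:
            return "antibiotics strongly discouraged"
        if pct_ug_per_l < 0.25:
            return "antibiotics discouraged"
        if pct_ug_per_l < 0.5:
            return "antibiotics encouraged"
        return "antibiotics strongly encouraged"

    print(procalcitonin_recommendation(0.3))   # -> antibiotics encouraged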
Abstract:
In a marvelous but somewhat neglected paper, 'The Corporation: Will It Be Managed by Machines?', Herbert Simon articulated from the perspective of 1960 his vision of what we now call the New Economy: the machine-aided system of production and management of the late twentieth century. Simon's analysis sprang from what I term the principle of cognitive comparative advantage: one has to understand the quite different cognitive structures of humans and machines (including computers) in order to explain and predict the tasks to which each will be most suited. Perhaps unlike Simon's better-known predictions about progress in artificial intelligence research, the predictions of this 1960 article hold up remarkably well and continue to offer important insights. In what follows I attempt to tell a coherent story about the evolution of machines and the division of labor between humans and machines. Although inspired by Simon's 1960 paper, I weave many other strands into the tapestry, from classical discussions of the division of labor to present-day evolutionary psychology. The basic conclusion is that, with growth in the extent of the market, we should see humans 'crowded into' tasks that call for the kinds of cognition for which humans have been equipped by biological evolution. These human cognitive abilities range from the exercise of judgment in situations of ambiguity and surprise to more mundane abilities in spatio-temporal perception and locomotion. Conversely, we should see machines 'crowded into' tasks with a well-defined structure. This conclusion is not based (merely) on a claim that machines, including computers, are specialized idiots-savants today because of the limits (whether temporary or permanent) of artificial intelligence; rather, it rests on a claim that, for what are broadly 'economic' reasons, it will continue to make economic sense to create machines that are idiots-savants.
Abstract:
With its turbulent and volatile legal evolution, the right to an abortion in the United States still remains a highly contested issue and has developed into one of the most divisive topics within modern legal discourse. By deconstructing the political underpinnings and legal rationale of the right to an abortion through a systematic case law analysis, I will demonstrate that this right has been incrementally destabilized. This instability embedded in abortion jurisprudence has been primarily produced by a combination of textual ambiguity in the case law and judicial ambivalence regarding this complex area of law. In addition, I argue that the use of the largely discredited substantive due process doctrine to ground this contentious right has also contributed to the lack of legal stability. I assert that when these elements culminate in the realm of reproductive privacy the right to terminate a pregnancy becomes increasingly unstable and contested.
Abstract:
The sensitivity of terrestrial environments to past changes in heat transport is expected to be manifested in Holocene climate proxy records on millennial to seasonal timescales. Stalagmite formation in the Okshola cave near Fauske (northern Norway) began at about 10.4 ka, soon after the valley was deglaciated. Past monitoring of the cave and surface has revealed stable modern conditions with uniform drip rates, relative humidity and temperature. Stable isotope records from two stalagmites provide time-series spanning from c. 10380 yr to AD 1997; a banded, multi-coloured stalagmite (Oks82) was formed between 10380 yr and 5050 yr, whereas a pristine, white stalagmite (FM3) covers the period from ~7500 yr to the present. The stable oxygen isotope (δ18Oc), stable carbon isotope (δ13Cc), and growth rate records are interpreted as showing i) a negative correlation between cave/surface temperature and δ18Oc, ii) a positive correlation between wetness and δ13Cc, and iii) a positive correlation between temperature and growth rate. Following this, the data from Okshola show that the Holocene was characterised by high-variability climate in the early part, low-variability climate in the middle part, and high-variability climate and shifts between two distinct modes in the late part. A total of nine Scandinavian stalagmite δ18Oc records of comparable dating precision are now available for parts or most of the Holocene. None of them show a clear Holocene thermal optimum, suggesting that they are influenced by annual mean temperature (cave temperature) rather than seasonal temperature. For the last 1000 years, δ18Oc values display a depletion-enrichment-depletion pattern commonly interpreted as reflecting the conventional view on climate development for the last millennium. Although the δ18Oc records show similar patterns and amplitudes of change, the main challenges for utilising high-latitude stalagmites as palaeoclimate archives are i) the accuracy of the age models, ii) the ambiguity of the proxy signals, and iii) calibration with monitoring data.
Abstract:
The aim of this paper is to identify the characteristics that distinguish the traveller of the second half of the twentieth century from his predecessors: the medieval pilgrim, the coloniser, the modern traveller. The theoretical framework is provided by reflections on postmodernism (Hutcheon, Lyotard, Nogerol, Vattimo) and the classic typologies of travellers in comparative literature studies (Pageaux, Wolfzettel). The hypothesis is that the travel writer of the second half of the twentieth century, far from being simply a mass tourist, travels alone, driven by the demands of his individualism. His text, free of ideological commitments, fragments reality and deconstructs it using subtle irony, contradiction and ambiguity. He is not interested in the originality of his writing but assembles it as an inter- and paratextual collage, and he is widely read because he belongs to the postmodern society marked by individualism, materialist values, spatio-temporal multiplicity, the mediatisation of reality, the massification of culture and consumerism. Thus the travel narrative, which has traditionally been a subjective text with an additive style and has served as an intermediary between cultures, proves to be a textual typology well suited to expressing the experiences of the postmodern traveller.
Abstract:
The concept of public space, whose current expansive tendency loads it with considerable ambiguity, is often treated in geography from an inescapable and sombre perspective. Over the last twenty years, this set of arguments has shaped a rhetoric about the loss of public space, condemning it to an unpromising fate. We propose to approach it in the City of Mendoza from a perspective of possibility and dynamism, in which public space is constantly being remade and redefined amid everyday conflicts, disputes and agreements in society that take on specific socio-spatial practices.
Abstract:
This article presents some general outlines of the sex/gender distinction and its impact on the delimitation of the category of the body in feminist theory. It then offers arguments that question sexual dimorphism in natural terms, drawing on Judith Butler's conceptualisations, the ambiguity of intersex bodies and certain subversive bodily practices. Along these lines, essentialist and constructivist approaches to the body are presented, illustrated by the positions of Luce Irigaray and Judith Butler. Finally, it concludes that the category of sex needs to be opened up to debate as a privileged occasion for reformulating the multiple conceptualisations that involve the dimension of the body.
Abstract:
Adults, very reluctant to expose their weaknesses, are loath to acknowledge their fears. Children, largely unaware of the conventions and social practices that restrict the emotional, have been and remain freer in accepting and talking about their fears. This fear, whether in individual or collective experiences, can be defined as an aversion to someone or something that invariably causes incapacities and restrictions of action to varying degrees. We have set out to examine this childhood fear, not only in order to identify its causes and effects, but also to confirm its persistence over the years and to observe how these fears conditioned the childhood existence of later generations.
Abstract:
The aim of this paper is to recover, on the one hand, the contributions made by the new intellectual history around the notion of political languages and, on the other, the theoretical pillars of a post-foundational conception of the political. To do so, we reconstruct the philosophical presuppositions underlying both perspectives and then explore how each can make up for what the other lacks. On the basis of this review, the article reconstructs the conceptual tools for thinking about a political theory capable of understanding the contingency and ambiguity of our past and recent political history, that is, a political theory capable of inquiring into the processes of meaning-making in which language and politics are complexly articulated.