718 results for Intuitive
Abstract:
The conception of notitia intuitiva, as understood by fourteenth-century philosophers, constitutes a mode of access to reality of Augustinian stamp that persists despite the progressive advance of the Aristotelian theory of abstraction. Taking notitia intuitiva as his point of departure, Ockham proposes a construction of knowledge and of science that offers a more delimited and precise interpretation of the thought of Duns Scotus. This Ockhamist doctrine was put to the test by a paradoxical development connected with the knowledge of the non-existent. The example has been pivotal in assessments of his gnoseology. In particular, the topic is expounded, with some nuances of difference, in three works: in the first quaestio of the Prologue of the Commentarium in Sententiis; in quaestiones 12-13 of the Reportatio II; and in Quaestiones Quodlibetales V and VI, to which should be added clarifications made within his writings on physics, in particular the Quaestiones Physicorum. Our purpose is to analyze these sources in order to discern the importance that this controversial passage assumes within Ockhamist gnoseology. The example has been interpreted in contrasting ways: for some it clearly introduces a skeptical conception of the doctrine of knowledge, in the sense that, once the possibility of an intuition of things that do not exist is admitted, we have no criterion for establishing when our knowledge is objective or subjective. For others, by contrast, it must be understood as a hypothesis that does not interfere with the natural plane of knowledge. Recourse to the sources will allow us to compare the levels of explication of the example in question and to determine its context and scope, with a view to examining the problem in the double domain of the natural and the supernatural, and offering our own conclusions on the matter.
Abstract:
Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impact of spatially variable and spatially correlated elevation errors in high-resolution DEMs on coastal inundation mapping. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation of elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. One thousand error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation for a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated as the proportion of the 1,000 simulations in which a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that at a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped with the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent in spatial data and analysis.
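As a rough illustration of the workflow described above, here is a minimal Python sketch of the Monte Carlo step: simulated error surfaces are added to a DEM and the per-cell inundation probability is the fraction of realizations flagged as flooded. The error-field generator and the plain threshold test below are simplified stand-ins for the paper's regression-kriging/sequential Gaussian simulation and its hydrologically correct bathtub model, and all numbers are illustrative.

```python
import numpy as np
from scipy.signal import convolve2d

def simulate_error_field(shape, sill=0.15, rng=None):
    """Stand-in for sequential Gaussian simulation: one spatially
    smoothed Gaussian error surface (metres). A real study would
    condition this on the fitted regression-kriging error model."""
    if rng is None:
        rng = np.random.default_rng()
    noise = rng.normal(0.0, np.sqrt(sill), shape)
    kernel = np.ones((5, 5)) / 25.0          # cheap spatial correlation
    return convolve2d(noise, kernel, mode="same", boundary="symm")

def inundation_probability(dem, water_level, n_sims=1000, seed=42):
    """Fraction of error realizations in which each cell lies below the
    water level (plain bathtub test; the paper's hydrological
    connectivity check is omitted here)."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(dem.shape)
    for _ in range(n_sims):
        counts += (dem + simulate_error_field(dem.shape, rng=rng)) < water_level
    return counts / n_sims

dem = np.random.default_rng(0).uniform(0.0, 3.0, (200, 200))   # toy DEM (m)
prob = inundation_probability(dem, water_level=1.8, n_sims=200)
print("cells inundated in >=99% of simulations:", int((prob >= 0.99).sum()))
```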
Abstract:
The Bounty Trough, east of New Zealand, lies along the southeastern edge of the present-day Subtropical Front (STF) and is a major conduit, via the Bounty Channel, for terrigenous sediment supply from the uplifted Southern Alps to the abyssal Bounty Fan. Census data on 65 benthic foraminiferal faunas (>63 µm) from upper bathyal (ODP 1119), lower bathyal (DSDP 594) and abyssal (ODP 1122) sequences test and refine existing models for the paleoceanographic and sedimentary history of the trough through the last 150 ka (marine isotope stages MIS 6-1). Cluster analysis allows recognition of six species groups whose distribution patterns coincide with bathymetry, the climate cycles and displaced turbidite beds. Detrended canonical correspondence analysis and comparisons with modern faunal patterns suggest that the groups are most strongly influenced by food supply (organic carbon flux) and, to a lesser extent, by bottom-water oxygen and factors relating to sediment type. Major faunal changes at upper bathyal depths (1119) probably resulted from cycles of counter-intuitive seaward-landward migrations of the Southland Front (SF), the north-south sector of the STF. Benthic foraminiferal changes suggest that lower-nutrient, cool Subantarctic Surface Water (SAW) was overhead in warm intervals, and higher-nutrient, warm neritic Subtropical Surface Water (STW) was overhead in cold intervals. At lower bathyal depths (594), foraminiferal changes indicate increased glacial productivity and lowered bottom oxygen, attributed to increased upwelling and inflow of cold, nutrient-rich Antarctic Intermediate Water (AAIW) and shallowing of the oxygen-minimum zone (upper Circumpolar Deep Water, CPDW). The observed cyclical benthic foraminiferal changes are not a result of associations migrating up and down the slope, as glacial faunas (dominated by Globocassidulina canalisuturata and Eilohedra levicula at upper and lower bathyal depths, respectively) are markedly different from those currently living in the Bounty Trough. On the abyssal Bounty Fan (1122), faunal changes correlate most strongly with grain size and are attributed to varying degrees of mixing of displaced and in-situ faunas. Most of the displaced foraminifera in turbiditic sand beds were sourced from mid-outer shelf depths at the head of the Bounty Channel. Turbidity currents were more prevalent during, but not restricted to, glacial intervals.
Abstract:
The isotopic composition of surface seawater is widely used to infer past changes in sea surface salinity from paired foraminiferal Mg/Ca and δ18O measurements in marine sediments. At low latitudes, paleosalinity reconstructions using this method have largely been used to document changes in the hydrological cycle. The method usually assumes that the modern seawater δ18O (δ18Osw)/salinity relationship remained constant through time. Modelling studies have shown that this assumption may not be valid, because large-scale atmospheric circulation patterns linked to global climate changes can alter the seawater δ18Osw/salinity relationship locally. Such processes have not yet been demonstrated with paleo-data because there is presently no way to reconstruct past changes in the seawater δ18Osw/salinity relationship. We address this issue by applying a multi-proxy salinity reconstruction to a marine sediment core collected in the Gulf of Guinea. We measured hydrogen isotopes in C37:2 alkenones (δDa) to estimate changes in seawater δD. We find a smooth, long-term increase of ~10 per mil in δDa between 10 and 3 kyr BP, followed by a rapid decrease of ~10 per mil between 3 kyr BP and the core top, to values slightly lighter than during the early Holocene. These features are inconsistent with published salinity estimates based on δ18Osw and foraminiferal Ba/Ca, as well as with the nearby continental rainfall history derived from pollen analysis. We combined δDa and δ18Osw values to reconstruct a Holocene record of salinity and compared it to a Ba/Ca-derived salinity record from the same sedimentary sequence. This combined method yields salinity trends in better agreement with both the Ba/Ca-derived salinity and the regional precipitation changes inferred from pollen records. Our results illustrate that changes in atmospheric circulation can alter precipitation isotopes in a counter-intuitive manner that ultimately affects surface salinity estimates based on seawater isotopic values. Our data suggest that trends in Holocene rainfall isotopic values at low latitudes may not result solely from changes in local precipitation associated with the amount effect.
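To make the combined-proxy step concrete, the sketch below shows the general shape of such a calculation in Python: alkenone δD is converted to seawater δD with an assumed constant biosynthetic fractionation, and salinity then follows from an assumed linear regional δDsw-salinity relation. The fractionation value and regression coefficients are hypothetical placeholders for illustration only, not the paper's calibration.

```python
import numpy as np

# Hypothetical constants, for illustration only; a real reconstruction
# would use site- and species-specific calibrations.
EPSILON_ALK = -225.0           # assumed alkenone/water fractionation (per mil)
SLOPE, INTERCEPT = 6.0, -205.0  # assumed regional dD_sw = SLOPE*S + INTERCEPT

def ddsw_from_alkenone(dd_alkenone):
    """Invert the fractionation: seawater dD from alkenone dD (per mil)."""
    alpha = 1.0 + EPSILON_ALK / 1000.0
    return (dd_alkenone + 1000.0) / alpha - 1000.0

def salinity_from_ddsw(dd_sw):
    """Assumed linear regional dD_sw-salinity relation."""
    return (dd_sw - INTERCEPT) / SLOPE

dd_a = np.array([-224.0, -220.0, -216.0])   # toy downcore alkenone dD values
salinity = salinity_from_ddsw(ddsw_from_alkenone(dd_a))
print(np.round(salinity, 2))                # toy salinities near ~35
```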
Abstract:
This paper presents a new tool for large-area photo-mosaicking (the LAPM tool). The tool was developed specifically for underwater mosaicking and is aimed at providing end-user scientists with an easy and robust way to construct large photo-mosaics from any set of images. It is notably capable of constructing mosaics with an unlimited number of images on any modern computer (minimum 1.30 GHz, 2 GB RAM). The mosaicking process can rely on both feature matching and navigation data. This is complemented by an intuitive graphical user interface, which gives the user the ability to select feature matches between any pair of overlapping images. Finally, mosaic files are given geographic attributes that permit direct import into ArcGIS. So far, the LAPM tool has been successfully used to construct geo-referenced photo-mosaics from photo and video material collected on several scientific cruises. The largest photo-mosaic contained more than 5000 images covering a total area of about 105,000 m². This is the first article to present and provide a finished, functional program for constructing large geo-referenced photo-mosaics of the seafloor using feature detection and matching techniques. It also presents concrete examples of photo-mosaics produced with the LAPM tool.
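The pairwise feature-matching step such a tool builds on can be sketched with standard OpenCV calls: detect keypoints in two overlapping images, match descriptors, and estimate the transform that registers one image onto the other. This is a generic sketch of the technique with placeholder filenames, not LAPM's actual code.

```python
import cv2
import numpy as np

def register_pair(img1_path, img2_path):
    """Estimate a homography mapping img2 onto img1 from ORB matches."""
    img1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Hamming distance suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects outlier matches before fitting the homography.
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H, int(inliers.sum())

# Placeholder filenames for two overlapping seafloor photos.
H, n_inliers = register_pair("photo_041.png", "photo_042.png")
print(f"{n_inliers} inlier matches; homography:\n{H}")
```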
Abstract:
Increasing amounts of data are collected in most areas of research and application. The degree to which these data can be accessed, analyzed, and retrieved is decisive for progress in fields such as scientific research and industrial production. We present a novel methodology supporting content-based retrieval and exploratory search in repositories of multivariate research data. In particular, our methods are able to describe two-dimensional functional dependencies in research data, e.g. the relationship between inflation and unemployment in economics. Our basic idea is to use feature vectors based on the goodness-of-fit of a set of regression models to describe the data mathematically. We call this approach Regressional Features and use it for content-based search and, since the approach motivates an intuitive definition of interestingness, for exploring the most interesting data. We apply our method to sizable real-world research datasets, showing the usefulness of our approach for user-centered access to research data in a Digital Library system.
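The core idea, describing a bivariate dependency by how well a fixed family of regression models fits it, can be sketched briefly. The particular model family and distance measure below are illustrative choices, not necessarily the ones used in the paper.

```python
import numpy as np
from sklearn.metrics import r2_score

def regressional_features(x, y):
    """Feature vector of goodness-of-fit (R^2) scores for a fixed
    family of candidate models fitted to the (x, y) dependency."""
    feats = []
    for degree in (1, 2, 3):                       # polynomial models
        coeffs = np.polyfit(x, y, degree)
        feats.append(r2_score(y, np.polyval(coeffs, x)))
    # Logarithmic model y = a*log(x) + b (requires x > 0).
    lx = np.log(np.clip(x, 1e-9, None))
    a, b = np.polyfit(lx, y, 1)
    feats.append(r2_score(y, a * lx + b))
    return np.clip(np.array(feats), 0.0, 1.0)      # clamp negative R^2

# Toy retrieval: rank stored datasets by similarity to a query dependency.
rng = np.random.default_rng(1)
x = np.linspace(1, 10, 100)
query = regressional_features(x, 2 * x**2 + rng.normal(0, 5, x.size))
stored = {
    "linear": regressional_features(x, 3 * x + rng.normal(0, 1, x.size)),
    "quadratic": regressional_features(x, x**2 + rng.normal(0, 2, x.size)),
}
for name, f in stored.items():
    print(name, np.linalg.norm(query - f))  # smaller distance = more similar
```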
Abstract:
The concept and logic of the "smile curve" in the context of global value chains has been widely used and discussed at the individual firm level, but rarely identified and investigated at the country and industry levels by using real data. This paper proposes an idea, based on an inter-country input-output model, to consistently measure both the strength and length of linkages between producers and consumers along global value chains. This idea allows for better identification and mapping of smile curves for countries and industries according to their positions and degrees of participation in a given conceptual value chain. Using the 1995-2011 World Input-Output Tables, several conceptual value chains are investigated, including exports of electrical and optical equipment from China and Mexico and exports of automobiles from Japan and Germany. The identified smile curves provide a very intuitive and visual image, which can significantly improve our understanding of the roles played by different countries and industries in global value chains. Further, the smile curves help identify the benefits gained by these countries and industries through their participation in global trade.
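For readers unfamiliar with input-output accounting, the toy sketch below illustrates the kind of computation such measures rest on: from a technical-coefficient matrix A one obtains the Leontief inverse B = (I - A)^(-1), whose entries capture the strength of producer-consumer linkages, while the power series B = I + A + A^2 + ... lets one weight each production round by its length. The 3-sector matrix is invented, and this is a generic textbook calculation, not the paper's exact strength/length measures, which are defined on full inter-country tables.

```python
import numpy as np

# Toy 3-sector technical coefficient matrix A (invented numbers):
# A[i, j] = input from sector i needed per unit of sector j's output.
A = np.array([[0.10, 0.30, 0.05],
              [0.20, 0.10, 0.25],
              [0.05, 0.15, 0.10]])

I = np.eye(3)
B = np.linalg.inv(I - A)   # Leontief inverse: total linkage strength

def average_length(A, max_rounds=200):
    """Average production-round 'length': weight each round k of the
    power series A^k by k, normalised by the total indirect requirement."""
    num = np.zeros_like(A)
    den = np.zeros_like(A)
    Ak = A.copy()
    for k in range(1, max_rounds + 1):
        num += k * Ak
        den += Ak
        Ak = Ak @ A
    return num / den

print("linkage strength (Leontief inverse):\n", B.round(3))
print("average propagation length:\n", average_length(A).round(3))
```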
Abstract:
Interaction with smart objects can be accomplished with different technologies, such as tangible interfaces or touch computing, among others. Some of these require the object to be specially designed to be 'smart', and others are limited in the variety and complexity of the possible actions. This paper describes a user-smart object interaction model and prototype based on the well-known event-condition-action (ECA) reasoning, which can work, to a degree, independently of the intelligence embedded in the smart object. It has been designed for mobile devices, which act as mediators between users and smart objects, and it provides an intuitive means for personalizing an object's behavior. When the user is close to an object, the object publishes its 'event & action' capabilities to the user's device. The user may accept the object's module offering, which enables him or her to configure and control that object, but also its actions with respect to other elements of the environment or the virtual world. The modular ECA interaction model facilitates the integration of different types of objects in a smart space, giving the user full control of their capabilities and facilitating creative mash-ups that build customized functionalities combining physical and virtual actions.
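A minimal sketch of ECA-style rule evaluation helps fix the idea: an event arrives, its rule's condition is checked, and only then does the action fire. The rule, event fields and device names below are invented; the paper's prototype adds capability discovery and a mobile mediator on top of this basic pattern.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class EcaRule:
    event_type: str                      # which event triggers the rule
    condition: Callable[[dict], bool]    # guard evaluated on the event
    action: Callable[[dict], None]       # effect executed if guard holds

class EcaEngine:
    def __init__(self):
        self.rules: list[EcaRule] = []

    def register(self, rule: EcaRule):
        self.rules.append(rule)

    def dispatch(self, event: dict):
        for rule in self.rules:
            if rule.event_type == event["type"] and rule.condition(event):
                rule.action(event)

# Invented example: a lamp that reacts to a presence sensor after dusk.
engine = EcaEngine()
engine.register(EcaRule(
    event_type="presence_detected",
    condition=lambda e: e["lux"] < 50,                 # only when dark
    action=lambda e: print(f"lamp_on in {e['room']}"),
))
engine.dispatch({"type": "presence_detected", "room": "hall", "lux": 12})
```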
Abstract:
The objective of this paper is to describe the methodological process behind a teaching strategy for training in project management complexity in postgraduate programs. The proposal combines different methods —intuitive, comparative, deductive, case study, problem-solving, Project-Based Learning— and different activities inside and outside the classroom. This integration of methods motivated the use of the concept of "learning strategy". The strategy has two phases: first, the integration of the competences —technical, behavioral and contextual— in real projects; second, a learning activity oriented toward a higher level of knowledge: evaluating the complexity of project management in real situations. Both the competences in the learning strategy and the Project Complexity Evaluation are based on the ICB of IPMA. The learning strategy is applied in an international postgraduate program —an Erasmus Mundus Master of Science— with the participation of five universities of the European Union. This master program is the fruit of a cooperative experience involving an Educational Innovation Group of the UPM (GIE-Project), two Research Groups of the UPM, and other agents external to the university. Some reflections on the experience and the main success factors of the learning strategy are presented in the paper.
Abstract:
This project, entitled "Characterization of collectors for concentration photovoltaics", consists of a LabVIEW application for obtaining the characteristics of the optical elements used in concentrator photovoltaic systems, based on the spatial distribution of the concentrated light spot they generate. A concentrator photovoltaic system uses an optical system to transmit luminous radiation to the solar cell, increasing the luminous power density. These optical systems are made up of mirrors or lenses that collect the incident radiation and concentrate the beam onto a much smaller surface. In this way the area of semiconductor material needed can be reduced, which entails a significant reduction in system cost. Different concentration systems can be distinguished according to the optics employed, the receiver structure or the concentration range. However, since the objective is to analyze the spatial distribution, we distinguish two types of concentrators according to the geometry of the light spot: the linear or cylindrical concentrator, which focuses onto a line, and the point-focus or circular concentrator, which focuses the light onto a point. Because of this difference, the analysis is carried out differently in each case. The analysis is performed by processing an image of the spot taken at the receiver position; this method is called LS-CCD (Light Scattering and CCD recording). It can be used in several setups, depending on whether the image is captured by reflection or by transmission at the receiver. In some setups it is not possible to capture the image perpendicular to the receiver, so the application performs a perspective correction to recover the spot in its original shape. The image of the spot provides detailed information on its uniformity through the surface map, a 3D representation of the image which is, however, rather unwieldy. A simpler and more useful representation is given by the so-called intensity profiles: the intensity profile, or irradiance distribution, which represents the distribution of light as a function of distance from the center, and the accumulated profile, or accumulated irradiance, which represents the light contained, also relative to the center. The representations of these profiles differ between a linear and a circular concentrator because of their different geometries: for a linear focus the profile is expressed as a function of the receiver semi-width, while for a circular one it is expressed as a function of the radius. In either case, they provide the information on the uniformity and size of the light spot needed to design the receiver. The objective of this project is the creation of a software application that processes and analyzes the images obtained from the light spot of the optical systems to be characterized. The application has a simple and intuitive interface so that it can be used by any user. The resources needed to carry out the project are: a PC running Windows, LabVIEW 8.6 Professional Edition, and the NI Vision Development Module (for working with images) and NI Report Generation Toolkit (for producing reports and saving application data).
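The two profiles described above are straightforward to compute from a spot image. The sketch below, a minimal NumPy version for the circular (point-focus) case, bins pixel intensities by distance from the spot centroid to get the irradiance distribution and accumulates them to get the accumulated irradiance; it is a generic stand-in, not the project's LabVIEW/NI Vision implementation.

```python
import numpy as np

def radial_profiles(img, n_bins=100):
    """Irradiance distribution and accumulated irradiance vs. radius
    from the intensity-weighted centroid of a 2D spot image."""
    img = img.astype(float)
    ys, xs = np.indices(img.shape)
    total = img.sum()
    cy, cx = (ys * img).sum() / total, (xs * img).sum() / total
    r = np.hypot(ys - cy, xs - cx)

    edges = np.linspace(0.0, r.max(), n_bins + 1)
    which = np.digitize(r.ravel(), edges) - 1
    flux = np.bincount(which, weights=img.ravel(), minlength=n_bins)[:n_bins]
    area = np.bincount(which, minlength=n_bins)[:n_bins]   # pixels per ring

    intensity = np.divide(flux, area, out=np.zeros(n_bins), where=area > 0)
    accumulated = np.cumsum(flux) / total      # fraction of light enclosed
    radii = 0.5 * (edges[:-1] + edges[1:])
    return radii, intensity, accumulated

# Toy Gaussian spot as a stand-in for a captured focus image.
y, x = np.mgrid[:256, :256]
spot = np.exp(-(((x - 128) ** 2 + (y - 120) ** 2) / (2 * 25.0 ** 2)))
radii, intensity, accumulated = radial_profiles(spot)
print("radius enclosing 90% of the light:",
      radii[np.searchsorted(accumulated, 0.9)])
```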
Abstract:
The weight of optical communications within telecommunications engineering continues to grow. Its applications, initially confined to the long-haul lines linking switching exchanges, nowadays reach into the home itself. Progress in this field, advancing relentlessly, aims not only to increase the transmission capacity of the systems but also to broaden the diversity of the processes performed on signals in the optical domain. This dynamism demands that professionals in the sector review and update their knowledge so that they can confidently resolve the questions arising in their engineering work. Moreover, in recent years the importance of optical communications has also been reflected in the various telecommunications engineering degrees, whose curricula cover this subject in both core and elective courses. The available information sources often approach this discipline with a mainly theoretical orientation. Professionals and engineering students therefore face a subject that deals with complex physical phenomena, rich in abstract concepts and elaborate mathematics, yet often lacking the practical perspective that is so important in engineering and that is, ultimately, what is demanded of students and engineers: the ability to solve problems and questions related to optical communications. Optical communication systems, and especially those using optical fiber as the medium for transmitting information, are undergoing major development in the field of telecommunications. The well-known advantages of fiber, mentioned in the preceding paragraph (large bandwidth, total immunity to electromagnetic disturbances, freedom from generated interference, low attenuation, etc.), have made it one of the areas of information and communication technology attracting the greatest interest from scientists, engineers, telecommunications operators and, of course, users. Given this reality, the objective and justification of this project is none other than to bring this technology closer to the future telecommunications engineer, or to anyone with a minimum of interest in the subject, and to show, in a practical and visual way, the different phenomena that occur in the transmission of information over optical fiber, as well as the different blocks and devices into which such a communication system is divided. To this end, the final-year project presented here develops a graphical user interface (GUI) that allows the user to easily configure each of the blocks making up a point-to-point optical fiber link. Each block offers several options which, as they are chosen and configured, change the behavior of the system and present the user with the different phenomena in an optical communication system, such as noise, dispersion and attenuation, for a better understanding and assimilation of the theory studied. The application, implemented in MATLAB, is thus intended to serve as a practical, friendly and intuitive complement to courses devoted to the study of optical communications.
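The phenomena such a GUI is meant to expose can be illustrated with first-order textbook formulas. The sketch below evaluates a point-to-point link with a simple power budget (attenuation plus splice/connector loss) and a crude chromatic-dispersion limit on the bit rate; all numbers are illustrative defaults, not values from the project.

```python
# First-order point-to-point fiber link check (textbook formulas).
def link_budget(p_tx_dbm, length_km, alpha_db_km=0.2,
                splice_loss_db=0.5, rx_sensitivity_dbm=-28.0):
    """Received power and margin for a single-span link."""
    p_rx = p_tx_dbm - alpha_db_km * length_km - splice_loss_db
    return p_rx, p_rx - rx_sensitivity_dbm

def dispersion_limited_bitrate(length_km, d_ps_nm_km=17.0,
                               linewidth_nm=0.1):
    """Crude dispersion limit: pulse broadening (delta_t = D * L *
    delta_lambda) must stay below ~1/4 of the bit period."""
    delta_t_ps = d_ps_nm_km * length_km * linewidth_nm
    return 0.25 / (delta_t_ps * 1e-12)       # bits per second

p_rx, margin = link_budget(p_tx_dbm=0.0, length_km=80.0)
print(f"received power: {p_rx:.1f} dBm, margin: {margin:.1f} dB")
print(f"dispersion-limited bit rate: "
      f"{dispersion_limited_bitrate(80.0) / 1e9:.1f} Gb/s")
```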
Abstract:
This article describes a knowledge-based method for generating multimedia descriptions that summarize the behavior of dynamic systems. We designed this method for users who monitor the behavior of a dynamic system with the help of sensor networks and make decisions according to predefined management goals. Our method generates presentations using different modes such as text in natural language, 2D graphics and 3D animations. The method uses a qualitative representation of the dynamic system based on hierarchies of components and causal influences. The method includes an abstraction generator that uses the system representation to find and aggregate relevant data at an appropriate level of abstraction. In addition, the method includes a hierarchical planner to generate a presentation using a model with discourse patterns. Our method provides an efficient and flexible solution to generate concise and adapted multimedia presentations that summarize thousands of time series. It is general enough to be adapted to different dynamic systems with acceptable knowledge-acquisition effort by reusing and adapting intuitive representations. We validated our method and evaluated its practical utility by developing several models for an application that operated continuously in real time for more than one year, summarizing sensor data of a national hydrologic information system in Spain.
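One way to picture the abstraction-generator step is as a walk over the component hierarchy that keeps only the series whose behavior is salient and reports them at the highest component level that summarizes them. The sketch below is a much-reduced illustration of that idea with invented component names and a simple salience test; the paper's generator works over a richer qualitative model with causal influences.

```python
import numpy as np

# Invented two-level hierarchy: basin -> gauging stations (toy data).
hierarchy = {
    "basin_A": {"station_1": np.array([2.0, 2.1, 2.0, 4.8]),
                "station_2": np.array([1.0, 1.1, 1.2, 1.1])},
    "basin_B": {"station_3": np.array([3.0, 3.1, 2.9, 3.0])},
}

def salient(series, rel_jump=0.5):
    """Simple salience test: last value jumps >50% above the mean."""
    return series[-1] > (1.0 + rel_jump) * series[:-1].mean()

def abstract(hierarchy):
    """Report at basin level if all children are salient,
    otherwise name only the salient stations."""
    findings = []
    for basin, stations in hierarchy.items():
        hits = [s for s, data in stations.items() if salient(data)]
        if hits and len(hits) == len(stations):
            findings.append(f"{basin}: rising levels across all stations")
        else:
            findings.extend(f"{basin}/{s}: rising level" for s in hits)
    return findings

print(abstract(hierarchy))   # -> ['basin_A/station_1: rising level']
```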
Abstract:
This paper addresses the design of visual paradigms for observing the parallel execution of logic programs. First, an intuitive method is proposed for arriving at the design of a paradigm and its implementation as a tool for a given model of parallelism. This method is based on stepwise refinement, starting from the definition of basic notions such as events and observables and some precedence relationships among events which hold for the given model of parallelism. The method is then applied to several types of parallel execution models for logic programs (Or-parallelism, Determinate Dependent And-parallelism, Restricted And-parallelism), for which visualization paradigms are designed. Finally, VisAndOr, a tool which implements all of these paradigms, is presented, together with a discussion of its usefulness through examples.
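The starting point of the method, events plus precedence relations, can be captured in a few lines: represent the execution trace as a set of events with happens-before edges and derive a drawing order by topological sorting. The event names below are invented for illustration and do not come from VisAndOr.

```python
from graphlib import TopologicalSorter

# Invented trace of an Or-parallel execution: each event is a node,
# edges encode the precedence relation (must happen before).
precedence = {
    "fork_or":    set(),                    # parent goal forks alternatives
    "branch_1":   {"fork_or"},
    "branch_2":   {"fork_or"},
    "success_1":  {"branch_1"},
    "fail_2":     {"branch_2"},
    "join":       {"success_1", "fail_2"},  # observation point
}

# A valid drawing order respects every precedence relationship.
order = list(TopologicalSorter(precedence).static_order())
print(order)
# Events with no path between them (branch_1 vs branch_2) may be
# drawn side by side: they ran in parallel.
```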
Abstract:
It is very often the case that programs require passing, maintaining, and updating some notion of state. Prolog programs often implement such stateful computations by carrying this state in predicate arguments (or, alternatively, in the internal database). This often causes code obfuscation, complicates code reuse, introduces dependencies on the data model, and is prone to incorrect propagation of the state information among predicate calls. To partly solve these problems, we introduce contexts as a consistent mechanism for specifying implicit arguments and their threading through clause goals. We propose a notation and an interpretation for contexts, ranging from single goals to complete programs, give an intuitive semantics, and describe a translation into standard Prolog. We also discuss a particular lightweight implementation in Ciao Prolog, and we show the usefulness of our proposals on a series of examples and applications, including code directly using contexts, DCGs, extended DCGs, logical loops and other custom control structures.
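The problem the contexts mechanism addresses, and the flavor of expanding implicit state into explicit arguments, can be suggested with a rough Python analogy: explicit threading clutters every call with accumulator parameters, while a small decorator can thread the state implicitly. This is only an analogy; the paper's mechanism is a source-to-source translation for Prolog, not the code below.

```python
import functools

# Explicit threading (the shape of expanded Prolog code): every call
# passes the state in and out by hand.
def count_words_explicit(words, state):
    for _ in words:
        state = state + 1        # threaded accumulator
    return state

# Implicit context: a decorator threads a hidden accumulator, so the
# body reads as if the state argument were invisible.
def with_context(fn):
    @functools.wraps(fn)
    def wrapper(*args):
        ctx = {"count": 0}       # hidden threaded state
        fn(ctx, *args)
        return ctx["count"]
    return wrapper

@with_context
def count_words(ctx, words):
    for _ in words:
        ctx["count"] += 1        # reads like local code, state is implicit

print(count_words_explicit(["a", "b", "c"], 0))  # 3, explicit threading
print(count_words(["a", "b", "c"]))              # 3, context-threaded
```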