888 results for Mind


Abstract:

Maurice Merleau-Ponty (1908-1961) has been known as the philosopher of painting. His interest in the theory of perception was intertwined with questions concerning the artist's perception, the experience of an artwork, and the possible interpretations of the artwork. For him, aesthetics was not a sub-field of philosophy, and art was not simply subject matter for aesthetic experience, but a form of thinking. This study proposes an opening for a dialogue between Merleau-Pontian phenomenology and contemporary art. The thesis examines his phenomenology through certain works of contemporary art and presents readings of these artworks through his phenomenology. The thesis both demonstrates the potential of a method and engages in the critical task of identifying the possible limitations of his approach. The first part lays out the methodological and conceptual points of departure of Merleau-Ponty's phenomenological approach to perception, as well as the features that determined his discussion of encountering art. Merleau-Ponty referred to the experience of perceiving art using the notion of seeing with (voir selon). He stressed a correlative reciprocity, described in Eye and Mind (1961) as the switching of the roles of the visible and the painter. The choice of artworks is motivated by certain restrictions in the phenomenological readings of visual arts. The examined works include paintings by Tiina Mielonen, a photographic work by Christian Mayer, a film by Douglas Gordon and Philippe Parreno, and an installation by Monika Sosnowska. These works resonate with, and challenge, his phenomenological approach. The chapters with case studies take up different themes that are central to Merleau-Ponty's phenomenology: space, movement, time, and touch. All of the themes are interlinked with the examined artworks. There are also topics that reappear in the thesis, such as the notion of écart and the question of encountering the other. As Merleau-Ponty argued, the sphere of art has a particular capability to address our being in the world. The thesis presents an interpretation that emphasises the notion of écart, which refers to an experience of divergence or dispossession: a sudden dissociation, surprise, or rupture is needed in order for a meeting between the spectator and the artwork, or between two persons, to be possible. Further, the thesis suggests that through artworks it is possible to take into consideration the écart, the divergence, that defines our subjectivity.

Abstract:

In this article we describe and demonstrate the versatility of a computer program, GENOME MAPPING, that uses interactive graphics and runs on an IRIS workstation. The program helps to visualize as well as analyse global and local patterns in genomic DNA sequences. It was developed with the requirements of the human genome sequencing programme in mind, which demands rapid analysis of the data. Using GENOME MAPPING, one can discern signature patterns of different kinds of sequences and analyse such patterns for repetitive as well as rare sequence strings. Further, one can visualize the extent of global homology between different genomic sequences. An application of our method to the published yeast mitochondrial genome data shows similar sequence organizations in the entire sequence and in smaller subsequences.
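
The original program is an interactive graphics tool and its implementation is not given in the abstract; purely as an illustration of the underlying idea of scanning a sequence for repetitive and rare strings, here is a minimal Python sketch based on k-mer counting (the k-mer length and toy sequence are assumptions):

```python
from collections import Counter

def kmer_counts(seq, k):
    """Count all overlapping k-mers (length-k substrings) in a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def repetitive_and_rare(seq, k=4, top=3):
    """Return the most frequent (repetitive) and least frequent (rare) k-mers."""
    ranked = kmer_counts(seq, k).most_common()
    return ranked[:top], ranked[-top:]

# toy usage on a made-up sequence with an obvious AT repeat
seq = "ATGCGATATATATATGCGATCGATCGAATATATAT"
repetitive, rare = repetitive_and_rare(seq)
print("most frequent 4-mers:", repetitive)
print("rarest 4-mers:", rare)
```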

Abstract:

There is a need to understand the carbon (C) sequestration potential of the forestry option and its financial implications for each country. In India, C emissions from deforestation are estimated to be nearly offset by C sequestration in forests under succession and in tree plantations. India has nearly succeeded in stabilizing the area under forests and has adequate forest conservation strategies. Biomass demands for softwood, hardwood, and firewood are estimated to double or treble by the year 2020. A set of forestry options was developed to meet the projected biomass needs and, keeping in mind the features of the available land categories, three scenarios were developed: potential, demand-driven, and programme-driven. Adoption of the demand-driven scenario, targeted at meeting the projected biomass needs, is estimated to sequester 78 Mt of C annually after accounting for all emissions resulting from clearfelling and end use of biomass. The demand-driven scenario is estimated to offset 50% of national C emissions at the 1990 level. The cost per tonne of C sequestered for the forestry options is lower than for the energy options considered. The annual investment required for implementing the demand-driven scenario is estimated to be US$2.1 billion for six years and is shown to be feasible. Among the forestry options, the ranking based on investment cost per tonne of C sequestered, from least to highest cost, is: natural regeneration, agroforestry, and enhanced natural regeneration (< US$2.5/t C), followed by timber, community, and softwood forestry (US$3.3 to 7.3/t C).
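
For readers who want to play with the reported figures, the sketch below simply encodes the abstract's cost ranking and sorts the options by investment cost per tonne of carbon (where only a bound or a range is given, the values used are assumptions, not figures from the paper):

```python
# Investment cost per tonne of carbon sequestered, as reported in the
# abstract. Where only a bound or a range is given, the values below
# are illustrative assumptions.
cost_per_tC = {
    "natural regeneration": 2.5,            # reported bound: < US$2.5/t C
    "agroforestry": 2.5,                    # reported bound: < US$2.5/t C
    "enhanced natural regeneration": 2.5,   # reported bound: < US$2.5/t C
    "timber forestry": 3.3,                 # low end of US$3.3-7.3/t C
    "community forestry": 5.3,              # assumed midpoint of the range
    "softwood forestry": 7.3,               # high end of the range
}

for option, cost in sorted(cost_per_tC.items(), key=lambda kv: kv[1]):
    print(f"{option:30s} ~US${cost:.1f}/t C")
```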

Abstract:

An attempt is made to present some challenging problems (mainly to the technically minded researcher) in the development of computational models for certain visual processes which are executed with apparently deceptive ease by the human visual system. However, in the interest of simplicity (and with a nonmathematical audience in mind), the presentation is almost completely devoid of mathematical formalism. Some of the findings in biological vision are presented in order to provoke approaches to their computational modelling. The development of ideas is not complete, and the vast literature on biological and computational vision cannot be reviewed here. A related but rather specific aspect of computational vision, namely the detection of edges, has been discussed by Zucker, who brings out some of the difficulties experienced in the classical approaches. Space limitations here preclude any detailed analysis of even the elementary aspects of information processing in biological vision. However, the main purpose of the present paper is to highlight some of the fascinating problems in the frontier area of modelling the human visual system mathematically.
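
The paper itself deliberately avoids formalism, but for orientation, here is a minimal sketch of the classical gradient-based edge detection whose limitations Zucker's discussion concerns (the Sobel operator, threshold value, and test image are assumptions for illustration):

```python
import numpy as np
from scipy import ndimage

def sobel_edges(image, threshold=0.25):
    """Classical gradient-magnitude edge detection using the Sobel operator."""
    gx = ndimage.sobel(image, axis=1, mode="reflect")  # horizontal gradient
    gy = ndimage.sobel(image, axis=0, mode="reflect")  # vertical gradient
    mag = np.hypot(gx, gy)
    mag /= mag.max() + 1e-12       # normalise magnitudes to [0, 1]
    return mag > threshold         # crude global threshold -> edge map

# toy usage: a bright square on a dark background
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
print(sobel_edges(img).sum(), "edge pixels found")
```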

Abstract:

The study of swirling boundary layers is of considerable importance in many rotodynamic machines such as rockets, jet engines, swirl generators, swirl atomizers, and arc heaters. For example, the introduction of swirl in a flow-acceleration device such as a rocket-engine nozzle promises efficient mass-flow control. In nuclear rockets, swirl is used to retain the uranium atoms in the rocket chamber. With these applications in mind, Back [1] and Muthanna and Nath [2] have obtained similarity solutions for a low-speed, three-dimensional, steady, laminar, compressible boundary layer with swirl inside an axisymmetric surface of variable cross section. The aim of the present analysis is to study the effect of massive blowing rates on the unsteady laminar swirling compressible boundary-layer flow over an axisymmetric body of arbitrary cross section when the freestream velocity and blowing rate vary with time. The type of swirl considered here is that of a free vortex superimposed on the longitudinal flow of a compressible fluid with variable properties. The analysis is applicable to external flow over a body as well as internal flow along a surface. For external flow, strong blowing can be of significant use in cooling the surface of hypervelocity vehicles, particularly when ablation occurs under large aerodynamic or radiative heating; there may not be such an important application of strong blowing for internal flow. The governing partial differential equations have been solved numerically using an implicit finite-difference scheme with a quasilinearization technique [3]. High-temperature gas effects such as radiation, dissociation, and ionization are not investigated. The nomenclature is mostly that of Ref. [4] and is listed in the full paper.
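
The paper's compressible swirling-flow equations are not reproduced in the abstract; as a rough stand-in for the numerical approach named (an implicit finite-difference scheme combined with iterative linearization), the following sketch solves the much simpler Blasius boundary-layer equation f''' + 0.5 f f'' = 0 by lagging the nonlinear coefficient between iterations, so that each pass is a linear implicit solve. Grid size, domain length, and tolerance are assumptions:

```python
import numpy as np

# Blasius equation f''' + 0.5*f*f'' = 0, f(0)=f'(0)=0, f'(inf)=1, solved
# for u = f' with the coefficient f lagged one iteration; each pass is then
# a *linear* ODE handled by an implicit central-difference scheme. This is
# a toy stand-in for the paper's compressible swirling-flow system.
N, eta_max = 200, 10.0
h = eta_max / N
u = np.linspace(0.0, 1.0, N + 1)            # initial guess for f'

for it in range(100):
    # f(eta) = integral of u from 0 to eta (trapezoidal rule)
    f = np.concatenate(([0.0], np.cumsum(0.5 * h * (u[1:] + u[:-1]))))
    A = np.zeros((N + 1, N + 1))
    rhs = np.zeros(N + 1)
    A[0, 0] = 1.0                           # u(0) = 0  (no slip)
    A[N, N] = 1.0
    rhs[N] = 1.0                            # u(eta_max) = 1  (freestream)
    for i in range(1, N):                   # u'' + 0.5*f*u' = 0, centred
        A[i, i - 1] = 1.0 / h**2 - 0.25 * f[i] / h
        A[i, i] = -2.0 / h**2
        A[i, i + 1] = 1.0 / h**2 + 0.25 * f[i] / h
    u_new = np.linalg.solve(A, rhs)
    if np.max(np.abs(u_new - u)) < 1e-8:    # iterates have converged
        u = u_new
        break
    u = u_new

print(f"converged in {it} iterations; f''(0) ~ {(u[1] - u[0]) / h:.4f}")
# the classical wall-shear value for this scaling is about 0.332
```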

Abstract:

The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon, in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of intruder location gives rise to correlated sensor readings. A signal-space viewpoint is introduced in which the noise-free sensor readings associated with intruder and clutter appear as surfaces f(s) and f(g), and the problem reduces to one of determining, in distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating f(s) and f(g) is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings are on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived, based on communication-complexity (CC) theory. A lower bound derived for the two-party average-case CC of general functions is compared against the performance of a greedy algorithm. Extension to the multi-party case is straightforward and is briefly discussed. The average-case CC of the relevant greater-than (GT) function is characterized to within two bits. Under the second approach, each sensor node broadcasts a single bit arising from an appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as a quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented, including intruder tracking using a naive polynomial-regression algorithm.
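
As a concrete toy of the second approach, the sketch below has each sensor broadcast one bit from a threshold quantization of its noisy reading, with a counting (k-out-of-n) rule at the fusion center; the signal model, noise level, threshold, and vote count are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sensor_bits(readings, tau):
    """Each sensor's 1-bit, two-level quantization of its own noisy reading."""
    return (readings > tau).astype(int)

def fusion(bits, k):
    """Counting rule: declare 'intruder' if at least k of the n sensors voted 1."""
    return int(bits.sum() >= k)

n = 10
intruder = rng.random() < 0.5
mean = 1.0 if intruder else 0.0          # intruder raises the local mean
readings = mean + rng.normal(0.0, 0.7, size=n)

bits = sensor_bits(readings, tau=0.5)    # one broadcast bit per sensor
print("truth:", intruder, " decision:", fusion(bits, k=n // 2 + 1))
```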

Abstract:

We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type, and report only the type. The goal is to find allocatively efficient, strategy-proof, nearly budget-balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as rebates to agents. Two performance criteria are of interest: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus, within the class of linear rebate functions; the goal is to minimize them. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems, where the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints. The relaxed problem is a linear programming problem (LP). We then identify the number of samples needed for "near-feasibility" of the relaxed constraint set. Under some conditions on the valuation function, we show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper fall back to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extension of the proposed mechanisms to situations where the valuation functions are not known to the central planner is also discussed. Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering of information privately held by strategic users, where the utilities are any concave function of the allocations, and where the resource planner is interested not in maximizing revenue but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, etc. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism, and then returning as much of the collected money as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear programming problem, for which several efficient solvers exist. This relaxation is enabled by constraint sampling. Keeping practitioners in mind, we identify the number of samples that assures a desired level of "near-feasibility" with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system. We demonstrate via simulation, however, that if the mechanism is repeated several times over independent instances, then past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
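
The core computational device, constraint sampling, is easy to illustrate in isolation. The sketch below approximates a problem with a continuum of half-plane constraints by sampling finitely many of them and solving the resulting LP; the particular objective, the constraint family a(t)·x <= 1, and the sample size are assumptions, not the paper's rebate-design problem:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)

# Continuum of half-plane constraints a(t).x <= b(t), t in [0, 1],
# relaxed to finitely many sampled constraints and solved as an LP.
def a(t):                        # constraint normal, varying with the "type" t
    return np.array([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])

def b(t):
    return 1.0                   # the full set carves out the unit disc

n_samples = 200                  # more samples => "near-feasible" solution
ts = rng.random(n_samples)
A_ub = np.stack([a(t) for t in ts])
b_ub = np.array([b(t) for t in ts])

c = np.array([-1.0, -1.0])       # maximize x + y over the sampled feasible set
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 2)
print(res.x)                     # approaches (0.707, 0.707) as samples grow
```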

Abstract:

Fully structured and mature open-source spatial and temporal analysis technology appears to be the carrier of the future for planning of natural resources, especially in developing nations. This technology has gained enormous momentum because of its technical superiority, affordability, and ability to pool expertise from all sections of society. Sustainable development of a region depends on the integrated planning approaches adopted in decision making, which require timely and accurate spatial data. With increased developmental programmes, the need for appropriate decision support systems has grown, in order to analyse and visualise decisions associated with the spatial and temporal aspects of natural resources. In this regard, a Geographic Information System (GIS), along with remote sensing data, supports applications that involve spatial and temporal analysis of digital thematic maps and remotely sensed images. Open-source GIS helps in wide-scale applications involving decisions at various hierarchical levels (for example, from village panchayat to planning commission) on economic viability and social acceptance, apart from technical feasibility. GRASS (Geographic Resources Analysis Support System, http://wgbis.ces.iisc.ernet.in/grass) is an open-source GIS that works on the Linux platform (freeware), but most of its functionality is driven through command-line arguments, necessitating a user-friendly and cost-effective graphical user interface (GUI). Keeping these aspects in mind, the Geographic Resources Decision Support System (GRDSS) has been developed with functionality such as raster, topological vector, image processing, statistical analysis, geographical analysis, and graphics production. It operates through a GUI developed in Tcl/Tk (Tool Command Language / Toolkit) under Linux, as well as through a shell in X-Windows. GRDSS includes options such as import/export of different data formats, display, digital image processing, map editing, raster analysis, vector analysis, point analysis, and spatial query, which are required for regional planning tasks such as watershed analysis and landscape analysis. It is customised to the Indian context, with an option to extract individual bands from IRS (Indian Remote Sensing) satellite data, which comes in the BIL (Band Interleaved by Line) format. The integration of PostgreSQL (also free software) provides GRDSS with an efficient database management system.
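
GRDSS itself is a Tcl/Tk GUI over GRASS, so its operations ultimately invoke GRASS modules. As a small indication of the kind of raster analysis the GUI wraps, here is a sketch using the GRASS Python scripting API (grass.script); it assumes it runs inside a GRASS session whose mapset contains an elevation raster named dem, and the output map names and watershed threshold are illustrative:

```python
# Minimal GRASS scripting sketch of typical GRDSS-style raster analysis.
# Assumes an active GRASS session with an elevation raster named "dem".
import grass.script as gs

gs.run_command("g.region", raster="dem")              # match region to the DEM
gs.run_command("r.slope.aspect", elevation="dem",
               slope="slope", aspect="aspect", overwrite=True)
gs.run_command("r.watershed", elevation="dem", threshold=10000,
               basin="basins", accumulation="flowacc", overwrite=True)
print(gs.read_command("r.info", map="slope", flags="r"))  # slope min/max
```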

Abstract:

This paper presents studies on the use of carbon nanotubes dispersed in an insulating fluid to serve as an automaton for healing open-circuit interconnect faults in integrated circuits. The physics behind the repair mechanism is electric-field-induced diffusion-limited aggregation. On the occurrence of an open fault, the repair is automatically triggered by the presence of an electric field across the gap. We study the repair time as a function of the electric field and the dispersion concentration with the above application in mind.
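
The abstract names field-induced diffusion-limited aggregation (DLA) as the repair mechanism. Purely as a qualitative cartoon of that process, the sketch below grows a DLA deposit from one electrode across a gap, with the field crudely modelled as a drift bias in the random walk; grid size, bias strength, and particle count are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 2-D DLA in an interconnect gap: particles random-walk in from the
# right electrode and stick on touching the deposit seeded at the left
# electrode. The electric field is modelled as a leftward drift probability.
W, H = 60, 30
grid = np.zeros((H, W), dtype=bool)
grid[:, 0] = True                              # left electrode seeds the deposit
steps = np.array([(-1, 0), (1, 0), (0, -1), (0, 1)])

def walk_one_particle(bias=0.25):
    y, x = int(rng.integers(0, H)), W - 1      # launch at the right electrode
    while True:
        if rng.random() < bias:
            dy, dx = 0, -1                     # field-driven drift across the gap
        else:
            dy, dx = steps[rng.integers(0, 4)] # unbiased diffusion step
        y = (y + dy) % H                       # periodic walls top/bottom
        x = min(max(x + dx, 1), W - 1)
        # stick if any 4-neighbour already belongs to the deposit
        if grid[y, x - 1] or grid[(y - 1) % H, x] or grid[(y + 1) % H, x] \
                or (x + 1 < W and grid[y, x + 1]):
            grid[y, x] = True
            return

for n in range(4000):
    walk_one_particle()
    if grid[:, W - 2].any():                   # deposit has bridged the gap
        print(f"gap bridged after {n + 1} particles")
        break
else:
    print("gap not yet bridged; frontier at column", np.where(grid)[1].max())
```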

Abstract:

We discuss the size-dependent density of nanoparticles and nanostructured materials, keeping the recent experimental results in mind. The density is predicted to increase with decreasing size for nanoparticles, but it can decrease with decreasing size for nanostructured materials, which corroborates the experimental results reported in the literature.

Abstract:

Memory models for shared-memory concurrent programming languages typically guarantee sequential consistency (SC) semantics for data-race-free (DRF) programs, while providing very weak or no guarantees for non-DRF programs. In effect, programmers are expected to write only DRF programs, which are then executed with SC semantics. With this in mind, we propose a novel scalable solution for dataflow analysis of concurrent programs, which is proved to be sound for DRF programs with SC semantics. We use the synchronization structure of the program to propagate dataflow information among threads without having to consider all interleavings explicitly. Given a dataflow analysis that is sound for sequential programs and meets certain criteria, our technique automatically converts it to an analysis for concurrent programs.
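
To make the central idea concrete, here is a toy sketch: dataflow facts (a constant-propagation-style environment) are pushed between threads only along synchronization edges from a release point to the matching acquire point, rather than over all interleavings. The two-thread "program", its single sync edge, and the flat lattice are illustrative assumptions, not the paper's framework:

```python
TOP = "unknown"

def join(a, b):
    """Lattice join: agreeing constants stay, disagreement goes to 'unknown'."""
    if a is None:
        return b
    if b is None:
        return a
    return a if a == b else TOP

# each thread is a list of (label, var, constant) assignment points;
# the sync edge says: facts at the release label flow into the acquire label
thread1 = [("t1a", "x", 1), ("t1b", "y", 2)]   # t1b is followed by a lock release
thread2 = [("t2a", "z", 3)]                    # t2a runs after acquiring that lock
sync_edges = [("t1b", "t2a")]

facts = {}                                     # label -> fact *after* that point

def run_thread(points, incoming):
    env = dict(incoming)
    for label, var, val in points:
        env[var] = val
        facts[label] = dict(env)

run_thread(thread1, {})

# thread2's entry fact is the join of facts flowing in over sync edges
entry2 = {}
for src, dst in sync_edges:
    for var, val in facts[src].items():
        entry2[var] = join(entry2.get(var), val)

run_thread(thread2, entry2)
print(facts["t2a"])   # {'x': 1, 'y': 2, 'z': 3}: t1's writes visible after acquire
```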

Abstract:

The characterization of a closed-cell aluminum foam with the trade name Alporas is carried out here under compression loading at a nominal cross-head speed of 1 mm/min. Foam samples in the form of cubes are tested in a universal testing machine (UTM), and the average stress-strain behavior obtained clearly displays a plateau strength of approximately 2 MPa. It is noted that the specific energy absorption capacity of the foam can be high despite its low strength, which makes it attractive as a material for certain energy-absorbing countermeasures. The mechanical behavior of the present Alporas foam is simulated using cellular (i.e., so-called microstructure-based) and solid element-based finite element models. The efficacy of the cellular approach is shown, perhaps for the first time in the published literature, in terms of prediction of both the stress-strain response and the inclined fold formation during axial crush under compression loading. Keeping in mind future applications under impact loads, limited results are presented for foam samples subjected to low-velocity impact in a drop-weight test set-up.
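
Specific energy absorption here is simply the area under the compressive stress-strain curve up to a chosen strain, normalised by density. The sketch below evaluates it for an idealized ~2 MPa plateau response; the synthetic curve, cutoff strain, and density are assumptions (Alporas-type foams are roughly 0.2-0.25 g/cc):

```python
import numpy as np

strain = np.linspace(0.0, 0.7, 200)
# idealized response: linear rise to a ~2 MPa plateau, then densification
stress_MPa = np.minimum(2.0, strain / 0.02) \
    + np.where(strain > 0.5, 40.0 * (strain - 0.5) ** 2, 0.0)

rho = 250.0                                  # kg/m^3 (assumed foam density)
cutoff = strain <= 0.5                       # integrate up to densification
s, e = stress_MPa[cutoff] * 1e6, strain[cutoff]
energy_per_volume = np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(e))  # J/m^3
print(f"specific energy absorption ~ {energy_per_volume / rho / 1e3:.2f} kJ/kg")
```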

Abstract:

With ever-increasing demand for electric energy, additional generation and associated transmission facilities have to be planned and executed. In order to augment existing transmission facilities, proper planning and selective decisions are to be made, while keeping in mind the interests of the several parties who are directly or indirectly involved. The common trend is to plan optimal generation expansion over the planning period in order to meet the projected demand with minimum-cost capacity addition and a pre-specified reliability margin. Generation expansion at certain locations needs new transmission networks, which involves serious problems such as obtaining right of way and environmental clearance. In this study, an approach to the siting of additional generation facilities in a given system with minimum or no expansion of the transmission facilities is attempted, using network connectivity and the concept of electrical distance for the projected load demand. The proposed approach is suitable for large interconnected systems with multiple utilities. A sample illustration on a real-life system is presented in order to show how this approach improves the overall operating performance of the system with respect to specified performance parameters.
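
The abstract does not define its electrical-distance metric; one common choice, shown in the sketch below, derives it from the bus impedance matrix as d_ij = Z_ii + Z_jj - 2 Z_ij, with Z the pseudo-inverse of the network admittance (Laplacian) matrix. The 4-bus line data are made up, and the paper's actual metric may differ:

```python
import numpy as np

# branches as (from_bus, to_bus, admittance) -- toy values for a 4-bus ring
branches = [(0, 1, 5.0), (1, 2, 4.0), (2, 3, 2.0), (0, 3, 1.0)]
n = 4

Y = np.zeros((n, n))                 # build the network admittance matrix
for i, j, y in branches:
    Y[i, i] += y; Y[j, j] += y
    Y[i, j] -= y; Y[j, i] -= y

Z = np.linalg.pinv(Y)                # pseudo-inverse handles the singular Laplacian
d = np.diag(Z)[:, None] + np.diag(Z)[None, :] - 2 * Z
print(np.round(d, 3))                # symmetric matrix of electrical distances
```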

Abstract:

A study of the history and philosophy of India's contribution towards the exploration of space since antiquity provides interesting insights. The contributions are described over three periods, namely: (1) the ten millenniums from 10,000 BC, with a twilight period up to 900 AD; (2) the ten centuries from 900 AD to 1900 AD; and (3) the ten decades from 1900 AD to 2000 AD; called mythological, medieval, and modern respectively. Some important events during these periods provide a reference view of the progress. The Vedas during the mythological period and the Siddhantas during the medieval period, which are based on astronomical observations, indicate that the Indian contribution preceded that of other cultures. But most Western historians ignore this fact time and again, in spite of many proofs provided to the contrary. This chapter also shows that Indians had the proper scientific attitude of developing any physical theory through the triplet of mind, model, and measurements. It is this same triplet that forms the basis of the present-day, well-known Kalman filter technique. Up to about 1500 BC the Indian contribution was leading, but it lagged during foreign invasion and occupation and has been improving only since independence.
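
Since the chapter invokes the Kalman filter as the modern embodiment of the mind-model-measurements triplet, here is a minimal scalar Kalman filter sketch showing that fusion of a model prediction with noisy measurements; the constant-state model and the noise variances are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

x_true = 1.0                    # constant quantity being estimated (the model)
R = 0.5**2                      # measurement noise variance
x_hat, P = 0.0, 1.0             # initial estimate and its variance

for _ in range(20):
    z = x_true + rng.normal(0.0, 0.5)     # noisy measurement
    K = P / (P + R)                       # Kalman gain
    x_hat = x_hat + K * (z - x_hat)       # blend model prediction and data
    P = (1 - K) * P                       # updated uncertainty

print("estimate:", round(x_hat, 3), " true:", x_true)
```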

Abstract:

A substantial number of medical students in India have to bear an enormous financial burden to earn a bachelor's degree in medicine, referred to as the MBBS (Bachelor of Medicine and Bachelor of Surgery). This degree program lasts four and one-half years, followed by one year of internship. A postgraduate degree, such as the MD, has to be pursued separately on completion of the MBBS. Every medical college in India is part of a hospital where the medical students get clinical exposure during the course of their study. All, or at least a number of, medical colleges in a given state are affiliated to a university that mainly plays the role of an overseeing authority. The medical colleges usually have no official interaction with other disciplines of education, such as science and engineering, perhaps because of their independent location and the absence of emphasis on medical research. However, many of the medical colleges are adept at imparting high-quality and sound training in medical practice, including diagnostics and treatment. The medical colleges in India are generally of two types, i.e., government-owned and private. Since only a limited number of seats are available across India in the former category, only a small fraction of aspiring candidates can find admission in these colleges after performing competitively in the relevant entrance tests. A major advantage of studying in these colleges is the nominal tuition fees that have to be paid. On the other hand, a large majority of would-be medical graduates have to seek admission in privately run medical institutes, in which the tuition and other related fees can be mind-boggling when compared to their public counterparts. Except for candidates of exceptionally affluent background, the only alternative for fulfilling the dream of becoming a doctor is to finance one's study through hefty bank loans that may take years to pay back. It is often heard from patients that they are asked by doctors to undergo a plethora of diagnostic tests for apparently minor illnesses, which may financially benefit those prescribing the tests. The present paper attempts to throw light on the extent of the disparity in the cost of a medical education between state-funded and privately managed medical colleges in India; the average salary of a new medical graduate, which is often ridiculously low when compared to what is offered in entry-level engineering and business jobs; and the possible repercussions of this apparently unjust economic situation with regard to the exploitation of patients.