Abstract:
This thesis has discussed the development of a new metal-ion-doped panchromatic photopolymer for various holographic applications. A high-quality panchromatic holographic recording material with high diffraction efficiency, high photosensitivity and high spatial resolution is one of the key factors for the successful recording of true-colour holograms. The capability of the developed material for multicolour holography can be investigated. In the present work, multiplexing studies were carried out using a He-Ne laser (632.8 nm). Multiplexing can also be done using shorter-wavelength lasers such as the Ar+ ion (488 nm) and frequency-doubled Nd:YAG (532 nm) lasers, so as to increase the storage capacity. The photopolymer film studied had a thickness of only 130 μm. Films of high thickness (~500 μm) are highly desirable for competitive holographic memories. Hence thicker films can be fabricated and efforts can be made to record more holograms or gratings in the material. In the present study, attempts were made to record a data page in silver-doped MBPVA/AA photopolymer film. The image of a checkerboard pattern was recorded in the film and could be reconstructed with good image fidelity. Efforts can be made to determine the bit error rate (BER), which provides a quantitative measure of the quality of the reconstructed image. Multiple holographic data pages can also be recorded in the material by making use of different multiplexing techniques. Holographic optical elements (HOEs) are widely used in optical sensors, optical information processing, fibre optics, optical scanners and solar concentrators. The suitability of the developed film for recording holographic optical elements such as lenses, beam splitters and filters can be studied. The suitability of a reflection hologram recorded in acrylamide-based photopolymer for visual indication of environmental humidity has been reported. Studies can be done to optimize the film composition for the recording of reflection holograms. An improvement in the spatial resolution of PVA/acrylamide-based photopolymer by using a low-molecular-weight poly(vinyl alcohol) binder was recently reported. The effect of the molecular weight of the binder matrix on the holographic properties of the developed photopolymer system can be investigated. The incorporation of nanoparticles into a photopolymer system is reported to enhance the resolution and improve the dimensional stability of the system. Hence efforts can be made to incorporate silver nanoparticles into the photopolymer and to study their influence on its holographic properties. This thesis was a small venture towards the realization of a big goal: a competent holographic recording material with excellent properties for practical holographic applications. As a result of the present research, we could successfully develop an efficient panchromatic photopolymer system and demonstrate its suitability for recording transmission holograms and a holographic data page. The developed photopolymer system is expected to have significant applications in the fields of true-colour display holography, wavelength-multiplexed holographic storage, and holographic optical elements. Highly concentrated and determined effort has yet to be put forth for this expectation to become a reality.
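For context, the diffraction efficiency mentioned above is, for a lossless volume transmission grating read out at the Bragg angle, commonly estimated with Kogelnik's coupled-wave expression; the following standard form is quoted for illustration and is not taken from the thesis:

\[ \eta = \sin^2\!\left( \frac{\pi\, \Delta n\, d}{\lambda \cos\theta} \right), \]

where Δn is the refractive-index modulation, d the grating (film) thickness, λ the replay wavelength and θ the Bragg angle inside the medium. The expression makes explicit why thicker films (larger d) are attractive for holographic memories: they reach high efficiency at lower index modulation and exhibit sharper angular selectivity, which allows more holograms to be multiplexed.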
Abstract:
Thiosemicarbazones have recently attracted considerable attention due to their ability to form tridentate chelates with transition metal ions through either two nitrogen atoms and a sulfur atom (N–N–S) or an oxygen, a nitrogen and a sulfur atom (O–N–S). Considerable interest in thiosemicarbazones and their transition metal complexes has also grown in the areas of biology and chemistry due to biological activities such as antitumoral, fungicidal, bactericidal and antiviral action, as well as nonlinear optical properties. They have been used for metal analyses and for device applications related to telecommunications, optical computing, storage and information processing. The versatile applications of metal complexes of thiosemicarbazones in various fields prompted us to synthesize tridentate NNS-donor thiosemicarbazones and their metal complexes. As part of our studies on transition metal complexes with these ligands, the current work was undertaken with the following objectives:
1. To synthesize and physico-chemically characterize the following thiosemicarbazone ligands:
   a. Di-2-pyridyl ketone-N(4)-methyl thiosemicarbazone (HDpyMeTsc)
   b. Di-2-pyridyl ketone-N(4)-ethyl thiosemicarbazone (HDpyETsc)
2. To synthesize oxovanadium(IV), manganese(II), nickel(II), copper(II), zinc(II) and cadmium(II) complexes using the synthesized thiosemicarbazones as principal ligands and some anionic coligands.
3. To study the coordination modes of the ligands in the metal complexes using different physicochemical methods, such as partial elemental analysis and thermogravimetry, and different spectroscopic techniques.
4. To establish the structures of the compounds by single-crystal XRD studies.
Abstract:
In this paper we describe the methodology and the structural design of a system that translates English into Malayalam using statistical models. A monolingual Malayalam corpus and a bilingual English/Malayalam corpus are the main resources in building this statistical machine translator. The training strategy adopted has been enhanced by PoS tagging, which helps to get rid of insignificant alignments. Moreover, incorporating units like a suffix separator and a stop-word eliminator has proven effective in bringing about better training results. In the decoder, order-conversion rules are applied to reduce the structural difference between the language pair. The quality of the statistical output of the decoder is further improved by applying mending rules. Experiments conducted on a sample corpus have generated reasonably good Malayalam translations, and the results are verified with the F-measure, BLEU and WER evaluation metrics.
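Of the evaluation metrics listed, WER is the simplest to make concrete. The following self-contained Python sketch computes word error rate as a word-level Levenshtein distance normalised by the reference length; it is a generic textbook implementation, not code from the paper:

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[-1][-1] / max(len(ref), 1)

print(wer("the cat sat down", "the cat fell down"))  # one substitution in four words -> 0.25

BLEU and the F-measure require n-gram precision and precision/recall machinery, respectively, and are usually taken from an existing toolkit rather than reimplemented.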
Abstract:
Clustering combined with multihop communication is a promising solution to cope with the energy requirements of large-scale wireless sensor networks. In this work, a new cluster-based routing protocol, referred to as the Energy Aware Cluster-based Multihop (EACM) routing protocol, is introduced, with multihop communication between cluster heads for transmitting messages to the base station and direct communication within clusters. We propose EACM with both static and dynamic clustering. The network is partitioned into near-optimal load-balanced clusters by using a voting technique, which ensures that the suitability of a node to become a cluster head is determined by all its neighbors (see the sketch below). Results show that the new protocol outperforms LEACH in network lifetime and energy dissipation.
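The abstract does not spell out the voting rule, so the following Python sketch is only one plausible reading, offered for illustration: each node votes for the neighbor (or itself) with the highest residual energy, and a node becomes a cluster head when no neighbor has collected more votes. All names are invented for the example; this is not EACM's actual specification.

from collections import Counter

def elect_cluster_heads(neighbors: dict[int, list[int]],
                        energy: dict[int, float]) -> set[int]:
    """Illustrative neighbor-voting election (assumed rule, not EACM's exact one).

    Each node casts one vote for the node in its closed neighborhood with the
    highest residual energy; a node becomes a cluster head if it holds at
    least as many votes as every one of its neighbors.
    """
    votes = Counter()
    for node, nbrs in neighbors.items():
        candidates = nbrs + [node]
        votes[max(candidates, key=lambda n: energy[n])] += 1
    heads = set()
    for node, nbrs in neighbors.items():
        if all(votes[node] >= votes[n] for n in nbrs):
            heads.add(node)
    return heads

# Tiny 4-node example: node 2 has the most residual energy and wins.
nbrs = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
energy = {0: 0.4, 1: 0.6, 2: 0.9, 3: 0.5}
print(elect_cluster_heads(nbrs, energy))  # -> {2}

The appeal of such a scheme, as the abstract notes, is that headship reflects the judgement of the whole neighborhood rather than a node's self-assessment, which tends to balance cluster loads.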
Abstract:
The present study describes the interaction of a two-level atom with a squeezed field of time-varying frequency. By applying a sinusoidal variation to the frequency of the field, the randomness in the population inversion is reduced and the collapses and periodic revivals are regained. Quantum optics is an emerging field of physics which mainly deals with the interaction of atoms with quantised electromagnetic fields. The Jaynes-Cummings model (JCM) is a key model among these, describing the interaction between a two-level atom and a single-mode radiation field. The study begins with a brief history of light, atoms and their interactions, and then discusses the interaction between atoms and electromagnetic fields. The study suggests a method to manipulate the population inversion produced by the interaction, and to control the randomness in it, by applying a time dependence to the frequency of the interacting squeezed field. The change in behaviour of the population inversion due to the presence of a phase factor in the applied frequency variation is explained here. This study also describes the interaction between a two-level atom and an electromagnetic field in a nonlinear Kerr medium, and deals with atomic and field state evolution in a coupled cavity system. Our results suggest a new method to control and manipulate the population of states in two-level atom-radiation interaction, which is essential for quantum information processing. We have also studied the variation of the atomic population inversion with time when a two-level atom interacts with a light field that has a sinusoidal frequency variation with a constant phase. In both the coherent-field and squeezed-field cases, the population inversion behaves completely differently from the zero-phase frequency-modulation case. It is observed that in the presence of a non-zero phase φ, the population inversion oscillates sinusoidally. Also, the collapses and revivals gradually disappear as φ increases from 0 to π/2. When φ = π/2, the evolution of the population inversion is identical to the case in which a two-level atom interacts with a Fock state. Thus, by applying a phase-shifted frequency modulation one can induce in a linear medium the sinusoidal oscillations of atomic inversion normally observed in a Kerr medium. We noticed that the entanglement between the atom and the field can be controlled by varying the period of the field frequency fluctuations. The system has been solved numerically, and its behaviour for different initial conditions and different susceptibility values has been analysed. It is observed that for weak cavity coupling the effect of susceptibility is minimal. For strong cavity coupling, the susceptibility factor modifies the way the probability oscillates with time. The effect of susceptibility on the probability of states is closely related to the initial state of the system.
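For reference, the Jaynes-Cummings interaction underlying these results is usually written (in the rotating-wave approximation) as below; the sinusoidal frequency modulation is included in the form the abstract suggests, and the exact parametrisation used in the study may differ:

\[ H = \frac{\hbar\omega_a}{2}\,\sigma_z + \hbar\,\omega(t)\, a^{\dagger} a + \hbar g \left( a^{\dagger} \sigma_- + a\, \sigma_+ \right), \qquad \omega(t) = \omega_0 + A \sin(\nu t + \varphi), \]

where σ_z and σ_± are the atomic inversion and raising/lowering operators, a (a†) annihilates (creates) a field photon, g is the coupling strength, and A, ν and φ are the amplitude, frequency and phase of the modulation. The case φ = π/2 discussed above corresponds to a cosine modulation of the field frequency.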
Abstract:
During recent years, quantum information processing and the study of N-qubit quantum systems have attracted a lot of interest, both in theory and experiment. Apart from the promise of performing efficient quantum information protocols, such as quantum key distribution, teleportation or quantum computation, these investigations have also revealed a great deal of difficulties which still need to be resolved in practice. Quantum information protocols rely on the application of unitary and non-unitary quantum operations that act on a given set of quantum mechanical two-state systems (qubits) to form (entangled) states, in which the information is encoded. The overall system of qubits is often referred to as a quantum register. Today the entanglement in a quantum register is known as the key resource for many protocols of quantum computation and quantum information theory. However, despite the successful demonstration of several protocols, such as teleportation or quantum key distribution, there are still many open questions about how entanglement affects the efficiency of quantum algorithms or how it can be protected against noisy environments. To facilitate the simulation of such N-qubit quantum systems and the analysis of their entanglement properties, we have developed the Feynman program. The program package provides all necessary tools to define and deal with quantum registers, quantum gates and quantum operations. Using an interactive and easily extendible design within the framework of the computer algebra system Maple, the Feynman program is a powerful toolbox not only for teaching the basic and more advanced concepts of quantum information but also for studying their physical realization in the future. To this end, the Feynman program implements a selection of algebraic separability criteria for bipartite and multipartite mixed states as well as the most frequently used entanglement measures from the literature. Additionally, the program supports work with quantum operations and their associated (Jamiolkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. As an application of the developed tools we further present two case studies in which the entanglement of two atomic processes is investigated. In particular, we have studied the change of the electron-ion spin entanglement in atomic photoionization and the photon-photon polarization entanglement in the two-photon decay of hydrogen. The results show that both processes are, in principle, suitable for the creation and control of entanglement. Apart from process-specific parameters like initial atom polarization, it is mainly the process geometry which offers a simple and effective instrument to adjust the final-state entanglement. Finally, for the case of the two-photon decay of hydrogenlike systems, we study the difference between nonlocal quantum correlations, as given by the violation of the Bell inequality, and the concurrence as a true entanglement measure.
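As an example of the entanglement measures such a toolbox implements, the concurrence of a two-qubit mixed state ρ is given by Wootters' formula, a standard result quoted here for illustration rather than notation taken from the Feynman program:

\[ C(\rho) = \max\{0,\ \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\}, \]

where the λ_i are the square roots, in decreasing order, of the eigenvalues of ρρ̃, with ρ̃ = (σ_y ⊗ σ_y) ρ* (σ_y ⊗ σ_y) and ρ* the complex conjugate of ρ in the computational basis. C = 0 for separable states and C = 1 for maximally entangled ones, which is what makes it a "true entanglement measure" in contrast to Bell-inequality violation.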
Abstract:
In psycholinguistic research, the assumption is widespread that evaluating information for its truth or plausibility (epistemic validation; Richter, Schroeder & Wöhrmann, 2009) is a strategic, optional process that follows comprehension (e.g., Gilbert, 1991; Gilbert, Krull & Malone, 1990; Gilbert, Tafarodi & Malone, 1993; Herbert & Kübler, 2011). A growing number of studies, however, call this two-step model of comprehension and validation into question, directly or indirectly. In particular, findings on Stroop-like stimulus-response compatibility effects, which arise when positive and negative responses must be given orthogonally to the task-irrelevant truth value of sentences (e.g., a positive response after reading a false sentence, or a negative response after reading a true sentence; the epistemic Stroop effect, Richter et al., 2009), indicate that readers carry out a non-strategic check of the validity of information during comprehension itself. Building on these findings, the aim of this dissertation was a further test of the assumption that comprehension involves a non-strategic, routinised, knowledge-based validation process (epistemic monitoring; Richter et al., 2009). To this end, three empirical studies with different emphases were conducted. Study 1 examined whether evidence of epistemic monitoring can also be found for information that is not clearly true or false but merely more or less plausible. Using the epistemic Stroop paradigm of Richter et al. (2009), a compatibility effect of task-irrelevant plausibility on the latencies of positive and negative responses was demonstrated in two different experimental tasks, indicating that epistemic monitoring is also sensitive to gradual differences in how well information fits world knowledge. Moreover, the results show that the epistemic Stroop effect is indeed driven by plausibility and not by differences in the predictability of plausible and implausible information. Study 2 tested the hypothesis that epistemic monitoring does not require an evaluative mindset. In contrast to the findings of other authors (Wiswede, Koranyi, Müller, Langner, & Rothermund, 2013), this study found a compatibility effect of the task-irrelevant truth value on response latencies in a completely non-evaluative task. The results suggest that epistemic monitoring does not depend on an evaluative mindset, though it may depend on the depth of processing. Study 3 examined the relationship between comprehension and validation by investigating the online effects of plausibility and predictability on eye movements during the reading of short texts. In addition, the potential modulation of these effects by epistemic markers that signal the certainty of information (e.g., certainly or perhaps) was examined. Consistent with the assumption of a fast, non-strategic epistemic monitoring process, interactive effects of plausibility and the presence of epistemic markers emerged on indicators of early comprehension processes.
This suggests that the communicated certainty of information is taken into account by the monitoring process. Taken together, the findings argue against a conceptualisation of comprehension and validation as non-overlapping stages of information processing. Rather, an evaluation of truth or plausibility based on world knowledge appears to be, at least to some extent, an obligatory and non-strategic component of language comprehension. Implications of the findings for current models of language comprehension and recommendations for further research on the relationship between comprehension and validation are discussed.
Abstract:
Since no physical system can ever be completely isolated from its environment, the study of open quantum systems is pivotal to reliably and accurately control complex quantum systems. In practice, the reliability of the control field needs to be confirmed via certification of the target evolution, while accuracy requires the derivation of high-fidelity control schemes in the presence of decoherence. In the first part of this thesis an algebraic framework is presented that makes it possible to determine the minimal requirements for the unique characterisation of arbitrary unitary gates in open quantum systems, independent of the particular physical implementation of the employed quantum device. To this end, a set of theorems is devised that can be used to assess whether a given set of input states to a quantum channel is sufficient to judge whether a desired unitary gate is realised. This allows the minimal input for such a task to be determined, which proves to be, quite remarkably, independent of system size. These results elucidate the fundamental limits on certification and tomography of open quantum systems. Combining these insights with state-of-the-art Monte Carlo process certification techniques permits a significant improvement in scaling when certifying arbitrary unitary gates. This improvement is not restricted to quantum information devices whose basic information carrier is the qubit; it also extends to systems whose fundamental informational entities can be of arbitrary dimensionality, the so-called qudits. The second part of this thesis concerns the impact of these findings from the point of view of Optimal Control Theory (OCT). OCT for quantum systems utilises concepts from engineering, such as feedback and optimisation, to engineer constructive and destructive interferences in order to steer a physical process in a desired direction. It turns out that the aforementioned mathematical findings allow novel optimisation functionals to be deduced that significantly reduce not only the memory required by numerical control algorithms but also the total CPU time required to obtain a certain fidelity for the optimised process. The thesis concludes by discussing two problems of fundamental interest in quantum information processing from the point of view of optimal control: the preparation of pure states and the implementation of unitary gates in open quantum systems. For both cases specific physical examples are considered: for the former, the vibrational cooling of molecules via optical pumping, and for the latter, a superconducting phase qudit implementation. In particular, it is illustrated how features of the environment can be exploited to reach the desired targets.
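As a generic example of the kind of optimisation functional OCT works with (the thesis derives reduced-resource variants; the expression below is a textbook choice, not necessarily the one used there), the fidelity of an implemented evolution U(T) with respect to a target gate Ô on a d-dimensional space can be taken as

\[ F = \frac{1}{d^{2}} \left| \mathrm{Tr}\!\left( \hat{O}^{\dagger} U(T) \right) \right|^{2}, \]

which equals 1 exactly when the control field realises Ô up to a global phase; a numerical control algorithm iteratively updates the field to drive F towards 1. Evaluating such functionals for open systems normally requires propagating many states, which is where reducing the required input pays off in memory and CPU time.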
Abstract:
The image comparison operation (assessing how well one image matches another) forms a critical component of many image analysis systems and models of human visual processing. Two norms commonly used for this purpose are L1 and L2, which are specific instances of the Minkowski metric. However, there is often no principled reason for selecting one norm over the other. One way to address this problem is by examining whether one metric captures the perceptual notion of image similarity better than the other. With this goal, we examined perceptual preferences for images retrieved on the basis of the L1 versus the L2 norm. These images were either small fragments without recognizable content, or larger patterns with recognizable content created via vector quantization. In both conditions the subjects showed a consistent preference for images matched using the L1 metric. These results suggest that, in the domain of natural images of the kind we have used, the L1 metric may better capture human notions of image similarity.
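For reference, both norms are instances of the Minkowski metric; for two images x and y with N pixels (the notation here is generic rather than taken from the paper),

\[ d_p(x, y) = \left( \sum_{i=1}^{N} \lvert x_i - y_i \rvert^{p} \right)^{1/p}, \]

so that p = 1 gives the sum of absolute pixel differences (L1) and p = 2 the Euclidean distance (L2). Because L2 squares each difference, it weights a few large pixel discrepancies more heavily than many small ones, which is one intuition for why the two norms can rank candidate matches differently.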
Abstract:
Introduction: Child malnutrition is a major concern in developing countries and is linked to conditions of poverty. Methodology: Secondary study of a sample of 1,232 records of children under five with a nutritional diagnosis, obtained retrospectively from the 2009 SISVAN evaluation for Bogotá. Epi Info 6.04 and SPSS 17.0 were used for data processing. Results: 37.2% of the children were found to be at risk of malnutrition, 27.3% had acute malnutrition and 7.2% chronic malnutrition. Fontibón and Chapinero showed the highest acute and chronic malnutrition, respectively. Children with low family income, from stratum one, with mothers who study and work or who are divorced or widowed, or who are currently displaced, are at greater risk of malnutrition. Acute malnutrition is higher in children displaced for more than a year or with incomplete vaccination schedules. Where there are inadequate sanitation conditions, birth weight below 2,000 grams, mothers with only primary schooling, or age groups between 3 and 5 years, greater chronic malnutrition is observed. Children who receive exclusive breastfeeding show less acute and chronic malnutrition. Conclusions: In the population studied, the risk of malnutrition exceeds both acute and chronic malnutrition. The results suggest that malnutrition and the risk of malnutrition can be reduced by improving maternal education and sanitation, prolonging breastfeeding and completing vaccination schedules.
Abstract:
The aim of this study is to identify the scanning strategies used by those responsible for the management and design of company strategy. It starts from the fundamental fact that companies are immersed in an environment whose essential characteristics are uncertainty, at various levels, and turbulence, which in essence prevent predicting the outcome of the objectives set by management. The sample consists of 20 executives at levels one, two and three of companies from different sectors of the economy, based in Bogotá, Colombia, who were asked through a survey instrument about the frequency with which they carry out scanning activities across different dimensions of the environment in which their organizations operate, their sources, and their tools for analysing and processing information. It was concluded that the executives in the sample make extensive use of scanning strategies to explore the environment, and that they agree that their perception of the level of uncertainty in the environment decreases as they process and analyse information. It was likewise corroborated, in line with the theoretical framework, that at least 80% of the levels of uncertainty perceived at the company level fall between level two, defined as alternative futures, and level three, characterized by a range of futures.
Abstract:
Information technologies have become an important factor to take into account in each of the processes carried out along the supply chain. Their implementation and correct use give companies advantages that favour operational performance along the chain. The development and application of software have contributed to the integration of the different members of the chain, so that everyone from suppliers to the end customer perceives benefits in operational performance variables and in level of satisfaction, respectively. On the other hand, it is important to consider that implementation does not always produce positive results; on the contrary, the implementation process can be seriously affected by barriers that prevent maximizing the benefits that ICT provide.
Abstract:
Based on an analysis of the journalistic discourse of the newspaper El Espectador over a period running from 25 August 1983 to 2 September 1989, from which a textual corpus of news items, special reports, opinion columns and editorials was collected, this undergraduate thesis seeks to show that the newspaper was both a political and a social actor in the conflict between the drug-trafficking mafias and the Colombian state. It is argued that the various actions taken by the newspaper, in which it denounced and made visible the pivotal events of the moment, allowed it to become an interest group that deployed its full range of resources and capacity to influence and alter the behaviour of other actors involved in the so-called "war on drugs" during the 1980s in Colombia.
Abstract:
The world economy has shown a trend towards globalization and economic integration; this behaviour has significantly influenced the Colombian economy in recent years, promoting trade ties with different economies worldwide, with the objective of strengthening economic activity both internally and externally and seeking new opportunities for lasting economic growth through foreign direct investment, research and development, technological innovation and a skilled workforce, among others. Following this trend, Colombia began negotiating a free trade agreement with the European Union in 2012, a process that concluded successfully with the signing of the agreement on 31 July 2013, through Decree 1636, whereby Colombia and the European Union freely committed to comply with all the points agreed in the treaty. With the aim of producing a diagnosis of the dairy sub-sector, the current situation of the Colombian economy vis-à-vis the European Union was analysed in terms of the productivity and competitiveness of each of these economies, in order to determine the opportunities or threats that the free trade agreement with that economic bloc could represent, focusing specifically on the dairy-processing industry. The comparative diagnosis found large asymmetries between the dairy-processing industry in Colombia and its counterpart in the European Union (EU). Finally, the internal factors of the companies were analysed through the Internal Factor Evaluation Matrix (MEFI), an indicator that identifies weaknesses or strengths within companies according to an assessment of the characteristics of each one. In addition, the weaknesses, opportunities, strengths and threats of the companies were analysed through the SWOT (DOFA) matrix; based on the results of both matrices, recommendations were proposed that can be applied within the companies under study.
Abstract:
Introduction: Inhalation of coal dust favours the development of pneumoconiosis, causing irreversible lung damage that can be identified radiologically. Symptoms appear late, and the pathology can develop after several years of exposure. Objective: To characterize radiographic findings according to the International Labour Organization (ILO) 2000 methodology and relate them to respiratory symptoms in workers exposed to coal dust in underground mining in the department of Boyacá, Colombia, 2015. Materials and methods: Cross-sectional study of 232 miners; sociodemographic characteristics and respiratory signs and symptoms were recorded. Chest radiographs were taken and the ILO methodology was applied to describe the findings. Statistical association was assessed using Pearson's chi-squared test. SPSS Statistics 2.3 was used for data processing. Results: The entire population was male, with a mean age of 40.8 years. Coal cutter (picador) was the most frequent job, held by 72.4% of the workers. Radiographs showed small rounded opacities (q/q) in 42%. Expectoration was the most frequent symptom (66.4%). A statistically significant association was found between smoking and parenchymal abnormalities (p = 0.002). Conclusion: The prevalence of pneumoconiosis in the department of Boyacá was 29.7% among the workers assessed according to the ILO criteria for reading chest radiographs, so control measures need to be put in place to minimize workers' exposure.