970 results for Multimedia Digital Libraries
Abstract:
The basic requirements of future distributed multimedia (bandwidth, speed, flexibility and interactivity) call for the development of technologies such as optical fibre, SDH (Synchronous Digital Hierarchy) and ATM (Asynchronous Transfer Mode).
Abstract:
The Accessible Digital Home (HDA) of the ETSIST was created to bring new information technologies closer to people with specific accessibility and usability needs, providing them with tools that increase their quality of life, comfort, security and autonomy. The HDA environment comprises control elements for doors, blinds, lighting, water and gas; sensors for temperature, fire and gas; air-conditioning systems; entertainment systems; and security systems such as presence detectors and alarms, all supported by a network architecture that provides a residential gateway and broadband access. The main goal of this PFG (Proyecto Fin de Grado, final degree project) was the development of a low-cost authentication system for the Accessible Digital Home. The idea of integrating an authentication system into the HDA stems from the need to protect certain services available within a private setting from unwanted access, such as reading the messages stored on the answering machine, using multimedia equipment, deactivating security alarms, or simply configuring the environment for the authenticated user (light intensity, room temperature, etc.). The development prioritised the accessibility, usability and security principles required to create a non-invasive environment in which users can prove their identity to the HDA system. The proposed solution is a system based on recognising a stroke drawn by the user. The stroke serves as a key for validating users: to authenticate, the user must repeat the stroke registered in the system. The PFG justifies the choice of this authentication mechanism over the other alternatives available on the market. The application was tested with two peripherals of different ranges: the uDraw, created for the PS3, a digitising tablet and pen that wirelessly capture the user's strokes, and the Wacom Bamboo digitising tablet, which supports the same functionality with better accuracy. The tool also admits other kinds of devices, such as the Texas Instruments Chronos eZ430, a wristwatch with a 3-axis accelerometer capable of translating the user's movements into mouse-pointer motion. The PFG is divided into three major blocks of work. The first focuses on analysing the system and its constituent technologies, including the image-based pattern-recognition algorithms best suited to the user's needs. The second covers a trial version built on the preceding analysis and UML design, on which proof-of-concept tests were run and the viability of the project was confirmed. The last block covers the verification and validation of the system through tests certifying that the quality levels needed to meet the stated objectives were reached, followed by the production of the final documentation.
As a result of this work, a system with an easily extensible architecture has been obtained. Through techniques such as introspection, the business-layer logic is separated from the code that implements it, so implementations can be replaced simply and intuitively through configuration files, making the system flexible and scalable. The final product responded satisfactorily, reaching the required quality levels: it provides an alternative to conventional authentication systems while keeping security high and making accessibility and low cost its most notable features.
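To make the introspection-based architecture above concrete, here is a minimal Python sketch of loading an authentication backend from a configuration file. It is not code from the PFG: the file name auth.ini, the [auth] section keys and the authenticate method are hypothetical, chosen only to illustrate how reflection lets an implementation be replaced without touching the calling code.

```python
# auth_loader.py - minimal sketch of configuration-driven introspection.
# The config file name, section keys and authenticate() interface below are
# hypothetical; the PFG's actual class and file names are not documented here.
import configparser
import importlib

def load_authenticator(config_path: str):
    """Instantiate the authenticator class named in a config file."""
    config = configparser.ConfigParser()
    config.read(config_path)
    module_name = config["auth"]["module"]   # e.g. "stroke_auth"
    class_name = config["auth"]["class"]     # e.g. "StrokeAuthenticator"
    module = importlib.import_module(module_name)
    return getattr(module, class_name)()     # reflection: no hard-coded import

if __name__ == "__main__":
    # Swapping the authentication backend only requires editing auth.ini;
    # the calling code never references a concrete class.
    authenticator = load_authenticator("auth.ini")
    print(authenticator.authenticate(user="alice", stroke=[(0, 0), (5, 8)]))
```

With this pattern, replacing the stroke-based authenticator with another mechanism amounts to editing one configuration entry rather than modifying and recompiling the business layer.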
Abstract:
Morse code, invented in 1838 for use in telegraphy, is perhaps one of the first examples of the practical use of data compression [1]: the most common letters of the alphabet are encoded with shorter codes than the rest. From 1940 onward, following the development of information theory and the creation of the first computers, compressing information has been a constant and fundamental challenge for researchers of every kind. The greater our understanding of what information means, the greater our success at compressing it. Multimedia information, by its nature, allows lossy compression, reaching compression ratios impossible for lossless algorithms. These "recent" lossy algorithms have mostly been based on transforming the information into the frequency domain and discarding part of it in that domain. Transforming into the frequency domain has advantages but also entails unavoidable computational costs. This thesis presents a new multimedia compression algorithm called LHE (Logarithmical Hopping Encoding) that requires no transformation into the frequency domain; it works in the spatial domain instead, which makes it a linear algorithm of reduced computational complexity. Its results are promising, outperforming the JPEG standard in both quality and speed. The algorithm is grounded in the physiological response of the human eye to light: like the other senses, the eye responds to the logarithm of the signal, in accordance with Weber's law. The algorithm consists of several stages. One of them is the measurement of "perceptual relevance", a new metric of how relevant a piece of information is in the subject's mind; based on it, content is degraded to a greater or lesser degree through what I have called "elastic downsampling". The elastic downsampling stage is a technique without precedent in digital image processing: it takes more or fewer samples in different areas of an image according to their perceptual relevance. This thesis takes the first steps toward what may become a new, patent-free standard format for multimedia compression (image, video and audio) with high performance in both speed and quality.
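As an illustration of the Weber-law principle the thesis builds on, the following Python sketch quantises a row of pixels with prediction "hops" whose sizes grow geometrically, so that equal luminance ratios cost roughly equal code steps. This is not the LHE codec itself; the hop sizes, growth ratio and greedy encoder are assumptions made purely for illustration.

```python
# weber_quantizer.py - sketch of logarithmic (Weber-law) luminance quantization.
# This is NOT the LHE codec; it only illustrates the perceptual principle the
# thesis builds on: equal *ratios* of luminance are perceived as equal steps,
# so quantization hops can grow with the prediction error.
import numpy as np

def weber_hops(base_hop: float = 4.0, ratio: float = 1.5, n_hops: int = 4):
    """Return positive hop sizes that grow geometrically, per Weber's law."""
    return np.array([base_hop * ratio**i for i in range(n_hops)])

def encode_row(pixels: np.ndarray, hops: np.ndarray) -> list[int]:
    """Greedy per-pixel coder: pick the hop closest to the prediction error."""
    signed_hops = np.concatenate([-hops[::-1], [0.0], hops])
    prediction = float(pixels[0])
    codes = []
    for p in pixels:
        idx = int(np.argmin(np.abs(signed_hops - (float(p) - prediction))))
        codes.append(idx)
        prediction = float(np.clip(prediction + signed_hops[idx], 0, 255))
    return codes

if __name__ == "__main__":
    row = np.array([100, 104, 120, 119, 60, 58, 200], dtype=np.uint8)
    print(encode_row(row, weber_hops()))
```

Because every pixel is coded by a hop index rather than a raw value, the scheme stays linear in the number of pixels and never leaves the spatial domain.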
Abstract:
This project seeks the optimal solution for controlling the Accessible Digital Home unit located at the ETSIST - UPM. We begin by explaining the basic operation of a home-automation system, listing the devices used in this type of installation, and discussing the possible system architectures. To choose the most suitable option, a short survey is made of the existing technologies, both closed and open protocols as well as wireless and bus technologies, considering aspects such as ease of communication with other systems, reliability, cost and long-term maintenance of the installation. The KNX standard is studied in greater depth, since it is one of the technologies finally chosen for the project. Once the standard is chosen, we focus on the needs of the premises in order to define each element of the installation: sensors, actuators, intercom elements, processors and control devices. The next step is programming the dwelling. For this, the number of electrical circuits and the function each performs within the intelligent premises (switching, dimming, etc.) must first be structured and defined, so that each circuit can be assigned to the corresponding output of its own actuator. The dwelling is programmed through ETS, the software associated with the KNX standard; through this protocol we control lighting, motors, air conditioning and security. Because KNX offers limited resources for the logical programming of events and sequences of actions, and because the dwelling's graphical interface must be displayed, a processor has been integrated. Considering the wish to integrate control of a television, future extensions and other aspects, the integrated processor is from Crestron Electronics, a closed home-automation protocol with major strengths in multimedia control. The second part of the programming is therefore carried out with two further packages from that manufacturer: SIMPL Windows handles the logical programming of the system, while VisionTools creates the visualisation. Finally, we draw the necessary conclusions and attach a wiring diagram, an installation budget, floor plans and a short user manual.
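The circuit-to-actuator assignment described above can be sketched as a simple data model. The following Python fragment is illustrative only: real KNX projects define group addresses and channel assignments in ETS rather than in application code, and the device names and addresses here are hypothetical.

```python
# circuit_map.py - sketch of mapping electrical circuits to actuator outputs,
# as described above. Group addresses and channel numbers are hypothetical;
# real KNX installations configure these in ETS, not in application code.
from dataclasses import dataclass

@dataclass
class Circuit:
    name: str            # e.g. "living-room ceiling light"
    function: str        # "switching" or "dimming"
    actuator: str        # actuator device identifier
    channel: int         # output channel on that actuator
    group_address: str   # KNX group address, e.g. "1/0/3"

CIRCUITS = [
    Circuit("living-room ceiling light", "dimming",   "dim-actuator-1", 1, "1/0/1"),
    Circuit("hall light",                "switching", "sw-actuator-1",  1, "1/0/2"),
    Circuit("bedroom blind motor",       "switching", "sw-actuator-1",  2, "2/0/1"),
]

def outputs_for(actuator: str):
    """List which circuit is wired to each output of a given actuator."""
    return {c.channel: c.name for c in CIRCUITS if c.actuator == actuator}

if __name__ == "__main__":
    print(outputs_for("sw-actuator-1"))  # {1: 'hall light', 2: 'bedroom blind motor'}
```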
Abstract:
A “Digital Divide” in information and technological literacy exists in Utah between small hospitals and clinics in rural areas and the larger health care institutions in the major urban area of the state. The goals of the outreach program of the Spencer S. Eccles Health Sciences Library at the University of Utah address solutions to this disparity in partnership with the National Network of Libraries of Medicine—Midcontinental Region, the Utah Department of Health, and the Utah Area Health Education Centers. In a circuit-rider approach, an outreach librarian offers classes and demonstrations throughout the state that teach information-access skills to health professionals. Provision of traditional library services to unaffiliated health professionals is integrated into the library's daily workload as a component of the outreach program. The paper describes the history, methodology, administration, funding, impact, and results of the program.
Abstract:
The main goal of this project was to develop an efficient methodology allowing rapid access to structurally diverse scaffolds decorated with various functional groups. Initially, we discovered and subsequently developed an experimentally straightforward, high-yielding photoinduced conversion of readily accessible diverse starting materials into polycyclic aldehydes and their (hemi)acetals decorated by various pendants. The two-step sequence, involving the Diels-Alder addition of heterocyclic chalcones and other benzoyl ethylenes to a variety of dienes, followed by the Paternò-Büchi reaction, was described as an alkene-carbonyl oxametathesis. This methodology offers a rapid increase in molecular complexity and diversity of the target scaffolds. To develop this novel methodology further and explore its generality, we directed our attention to the Diels-Alder adducts based on various chromones. We discovered that the Diels-Alder adducts of chromones are capable of photoinduced alkene-arene [2+2] cycloaddition producing different dienes, which can either dimerize or be introduced into a double-tandem [4π+2π]·[2π+2π]·[4π+2π]·[2π+2π] synthetic sequence, followed by an acid-catalyzed oxametathesis, leading to a rapid expansion of molecular complexity over a few experimentally simple steps. Since oxametathesis had previously been observed primarily in aromatic oxetanes, we decided to prepare model aliphatic oxetanes with a conformationally unconstrained or "flexible" methyl group based on the Diels-Alder adducts of cyclohexadiene or cyclopentadiene with methyl vinyl ketone. Upon addition of an acid, the expected oxametathesis occurred with results similar to those observed in the aromatic series, proving the generality of this approach. We also synthesized polycyclic oxetanes derived from the Diels-Alder adducts of cyclic ketones. This not only gave us access to remarkably strained oxetane systems, but the mechanism of their protolytic ring opening also provided a great deal of insight into how strain affects reactivity. Additionally, we discovered that although the model hetero-Diels-Alder adducts did not undergo [2+2] cycloaddition, both exo- and endo-sulfa-Diels-Alder products were nonetheless photochemically active, and various products with defined stereochemistry could be produced upon photolysis. In conclusion, we have developed an approach to the encoding and screening of solution-phase libraries based on the photorelease of externally sensitized photolabile tags. The encoding tags can be released into solution only when a binding event occurs between the ligand and the receptor, equipped with an electron transfer sensitizer. The released tags are analyzed in solution, revealing the identity of the lead ligand or narrowing the range of potential leads.
Abstract:
Television today has undergone countless technological innovations in multimedia transmission, audiovisual quality and diversity of features, yet it essentially retains its characteristic of delivering information to the population almost instantaneously. The current digital-television environment is characterised by the coexistence of numerous devices capable of offering a television experience, bringing together personal computers, smartphones, tablets and other consumer electronics. Added to this scenario is the availability of numerous data-transport networks such as terrestrial broadcasting, satellite, cable and broadband networks. This diversified scenario of devices and networks is called hybrid digital television, and the viewer's interaction with the various devices is its distinguishing feature. These scenarios in turn motivate the development of technologies that improve pervasiveness and the means by which applications can be supported on different platforms. This work proposes interoperable environments involving interactive digital television and other consumer electronics; studies and experiments were carried out to observe different techniques for synchronisation and communication between interactivity platforms for hybrid digital television. The results point to the feasibility of interoperable scenarios based on markers as well as on TCP/IP network resources and services, taking into account the efficiency and effectiveness of the different methods. We conclude that these results may motivate the development of new scenarios involving interactive digital television and second-screen devices, increasing interactivity and the forms of entertainment.
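As a rough illustration of the TCP/IP-based synchronisation this work experiments with, the sketch below streams playback markers from a TV application to a second-screen client over a plain TCP socket. The newline-delimited JSON message format and the port number are assumptions for the example, not the protocol evaluated in the work.

```python
# sync_server.py - minimal sketch of TCP/IP-based second-screen synchronisation.
# The message format and port are hypothetical. The TV application pushes
# playback markers to a connected second-screen device, which can then align
# its own companion content with the broadcast.
import json
import socket

HOST, PORT = "0.0.0.0", 5005  # hypothetical port

def serve_markers(markers):
    """Accept one second-screen client and stream playback markers to it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((HOST, PORT))
        server.listen()
        client, _addr = server.accept()
        with client:
            for marker in markers:  # e.g. {"scene": 2, "t": 42.0}
                client.sendall((json.dumps(marker) + "\n").encode())

if __name__ == "__main__":
    serve_markers([{"scene": 1, "t": 0.0}, {"scene": 2, "t": 42.0}])
```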
Abstract:
The Colorado Alliance of Research Libraries has launched the Alliance Shared Print Trust and is developing a shared print analysis tool. The system lets libraries that have contributed their MARC records compare themselves with one another, quickly and easily determining which records are unique and which are held in common. The comparison system is built on open-source tools and is embedded in the Gold Rush framework. The author provides a brief overview of other shared print analysis tools.
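A minimal sketch of the kind of record comparison such a tool performs is shown below, using the pymarc library. It is not the Gold Rush implementation: production match keys normalise OCLC numbers, LCCNs and other identifiers, whereas this sketch simply compares control field 001, and the file names are hypothetical.

```python
# overlap_sketch.py - minimal sketch of shared-print overlap analysis: compare
# each library's MARC records by a match key to find unique vs. common holdings.
# NOT the Gold Rush implementation; real match keys are far more robust.
from pymarc import MARCReader  # pip install pymarc

def record_ids(path: str) -> set[str]:
    """Collect one match key (control field 001) per record in a MARC file."""
    ids = set()
    with open(path, "rb") as handle:
        for record in MARCReader(handle):
            if record is None:  # MARCReader yields None for unreadable records
                continue
            for field in record.get_fields("001"):
                ids.add(field.data.strip())
    return ids

if __name__ == "__main__":
    mine = record_ids("library_a.mrc")    # hypothetical file names
    others = record_ids("library_b.mrc")
    print(f"unique to A: {len(mine - others)}, held in common: {len(mine & others)}")
```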
Abstract:
The paper describes a procedure for accurately and speedily calibrating tanks used for the chemical processing of nuclear materials. The procedure features the use of (1) precalibrated vessels certified to deliver known volumes of liquid, (2) calibrated linear measuring devices, and (3) a digital computer for manipulating data and producing printed calibration information. Calibration records of the standards are traceable to primary standards. Logic is incorporated in the computer program to accomplish curve fitting and perform the tests to accept or to reject the calibration, based on statistical, empirical, and report requirements. This logic is believed to be unique.
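The paper describes its curve-fitting and accept/reject logic as unique, so the sketch below shows only the generic shape of such a workflow in Python: fit delivered volume against probe height by least squares, then accept the calibration if the fit residuals fall within a tolerance. The polynomial degree, data and tolerance are hypothetical, not the paper's actual statistical criteria.

```python
# tank_cal_sketch.py - generic sketch of the calibration workflow described
# above: fit a curve relating liquid height to delivered volume, then accept
# or reject the calibration from the fit residuals. Degree, data and tolerance
# are hypothetical; the paper's statistical and empirical tests differ.
import numpy as np

def fit_calibration(heights, volumes, degree=2):
    """Least-squares polynomial fit of volume as a function of height."""
    coeffs = np.polyfit(heights, volumes, degree)
    residuals = volumes - np.polyval(coeffs, heights)
    rms = float(np.sqrt(np.mean(residuals**2)))
    return coeffs, rms

if __name__ == "__main__":
    heights = np.array([10.0, 20.0, 30.0, 40.0, 50.0])      # cm, probe readings
    volumes = np.array([50.2, 101.1, 152.3, 203.0, 254.4])  # L, from standards
    coeffs, rms = fit_calibration(heights, volumes)
    TOLERANCE = 0.5  # litres, hypothetical acceptance criterion
    print("accept" if rms <= TOLERANCE else "reject", f"(rms={rms:.3f} L)")
```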
Abstract:
The international perspectives on these issues are especially valuable in an increasingly connected, but still institutionally and administratively diverse, world. The research addressed in several chapters in this volume includes issues around technical standards bodies like EpiDoc and the TEI, engaging with the ways these standards are implemented, documented, taught, used in the process of transcribing and annotating texts, and used to generate publications and as the basis for advanced textual or corpus research. Other chapters focus on various aspects of philological research and content creation, including collaborative or community-driven efforts, and the issues surrounding editorial oversight, curation, maintenance and sustainability of these resources. Research into ancient languages and linguistics, in particular Greek, and the language teaching that is a staple of our discipline, are also discussed in several chapters, in particular the ways in which advanced research methods can lead into language technologies and vice versa, and the ways in which teaching skills can be used for public engagement, and vice versa. A common thread through much of the volume is the importance of open-access publication or open-source development and distribution of texts, materials, tools and standards, both because of the public good provided by such models (circulating materials often already paid for out of the public purse) and because of the ability to reach non-standard audiences: those who cannot access rich university libraries or afford expensive print volumes. Linked Open Data is another technology that results in wide and free distribution of structured information both within and outside academic circles, and several chapters present academic work that includes ontologies and RDF, either as a direct research output or as an essential part of the communication and knowledge representation. Several chapters focus not on the literary and philological side of classics but on the study of cultural heritage, archaeology, and the material supports on which original textual and artistic material are engraved or otherwise inscribed, addressing the capture and analysis of artefacts in both 2D and 3D, the representation of data through archaeological standards, and the importance of sharing information and expertise between the several domains, both within and without academia, that study, record and conserve ancient objects. Almost without exception, the authors reflect on the issues of interdisciplinarity and collaboration, the relationship between their research practice and teaching and/or communication with a wider public, and the importance of the role of the academic researcher in contemporary society and in the context of cutting-edge technologies. How research is communicated in a world of instant-access blogging and 140-character micromessaging, and how our expectations of the media affect not only how we publish but how we conduct our research, are questions of which all scholars need to be aware and self-critical.
Abstract:
The consumption of academic journals has changed radically over the past decade, explains the author. While there has been an exponential rise in published scholarship, spiralling costs for commercial journals have caused institutional libraries to cut back their subscriptions to academic journals and have raised calls for free online access to the unpublished work that scholars produce. The rise of the Internet has facilitated a concomitant growth in online scholarship. What, asks the author, are the promises of online scholarship?
Abstract:
Cultural institutions in the UK are repositories of a wealth of historical material. The scholarly importance of such resources underlies the numerous digitisation projects aimed at widening access to them worldwide. The lack of national policies has left these institutions alone in engaging in dissemination activities and in raising awareness of their own online material. Of particular interest to the author are the digital special collections hosted in English universities. The main activities of these institutions differ from those of museums, archives and public libraries, as universities do not have the preservation and exploitation of their holdings among their principal institutional duties. This article highlights the related issues and suggests some possible measures to effectively promote and disseminate universities' online digital special collections.
Abstract:
This essay explores the development of public libraries in the context of an increasingly market-dominated economy and marketised society. It argues that although neo-liberalism as a policy goal and practice has taken different forms over time, there are common themes in its emphasis on market values, privatisation, and support for measures that reduce the role of public funding and the state in the provision of public services. This has led some commentators to express concern that the meaning and practice of citizenship and democracy are being transformed, managed or otherwise diminished. These concerns are compounded by changes effected by new digital technology. Imbricated with this issue are debates surrounding the future of the public library, and attempts by librarians and others to reinvent and reimagine its purpose. With reference to some innovative initiatives in the USA and Scandinavia, it is suggested that public libraries, through their service and spatial rearticulation, can conceivably help strengthen and revitalise public democracy and the public sphere.