982 results for Digital contents
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
Graduate Program in Digital Television: Information and Knowledge - FAAC
Abstract:
Graduate Program in Linguistics and Portuguese Language - FCLAR
Abstract:
Presents a laboratory under development within History of Culture, a subject taught in the Library and Archive Studies programme at UNESP (Marília). The project builds on the Information and Technology research line: second-year undergraduate students take part in improving 27 entries of the Portuguese-language Wikipedia. The aim is to build capacity for scientific reading and writing in digital media, and to develop skills in identifying and retrieving information and in interpreting, understanding and reorganizing its formal and content aspects. Activities include searching, selecting, remixing and republishing texts, images, audio and video across converging hypertext information sources, supported by tutors with strategic skills in digital environments. To this end, the project joined the Wikipedia Foundation's international University Campus Ambassadors project. It also seeks to foster sharing and collaboration behaviours, with the purpose of creating the habits needed for informational empowerment in Brazil. The methodology is to draw on the work of individuals already trained in wiki culture and to create, within the subject, information-sharing programmes with a more specialized focus, lending greater credibility to the digital environment. The syntax of the environment supports the learning of the complementary skills of reading and writing and serves as an open repository from which information can be reused. It is thus an empowerment strategy in the pursuit of autonomy and self-reliance, drawing on intersemiotic knowledge in the editing, visualization and understanding of information on the social web. A second stage is proposed: a study verifying the environment's credibility after the entry-improvement work has been consolidated and disseminated.
Abstract:
Graduate Program in Communication - FAAC
Abstract:
Graduate Program in Digital Television: Information and Knowledge - FAAC
Abstract:
In this work, a Monte Carlo code was used to investigate the performance of different x-ray spectra in digital mammography through a figure of merit (FOM), defined as FOM = CNR²/D̄_g, where CNR is the contrast-to-noise ratio in the image and D̄_g is the average glandular dose. The FOM was studied for breasts with different thicknesses t (2 cm ≤ t ≤ 8 cm) and glandular contents (25%, 50% and 75% glandularity). The anode/filter combinations evaluated were those traditionally employed in mammography (Mo/Mo, Mo/Rh, Rh/Rh) and a W anode combined with Al or K-edge filters (Zr, Mo, Rh, Pd, Ag, Cd, Sn), for tube potentials between 22 and 34 kVp. Results show that the W anode combined with K-edge filters provides higher values of FOM for all breast thicknesses investigated. Nevertheless, the most suitable filter and tube potential depend on the breast thickness, and for t ≥ 6 cm they also depend on breast glandularity. Particularly for thick and dense breasts, a W anode combined with K-edge filters can greatly improve the digital technique, with FOM values up to 200% greater than those obtained with the anode/filter combinations and tube potentials traditionally employed in mammography. For breasts with t < 4 cm, generally good performance was obtained with the W anode combined with a 60 μm Mo filter at 24-25 kVp, while a 60 μm Pd filter provided generally good performance at 24-26 kVp for t = 4 cm, and at 28-30 and 29-31 kVp for t = 6 and 8 cm, respectively.
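As a rough illustration of how such a figure of merit can be used to rank acquisition settings, the following Python sketch computes FOM = CNR²/D̄_g for a handful of hypothetical anode/filter/kVp combinations; the CNR and dose numbers are placeholders, not values from the study.

```python
# Minimal sketch of the figure-of-merit comparison described above:
# FOM = CNR^2 / mean glandular dose, evaluated for a few candidate
# acquisition settings. All numbers below are illustrative placeholders.

candidate_spectra = {
    # (anode/filter, kVp): (CNR, mean glandular dose in mGy)
    ("Mo/Mo", 26): (9.5, 1.60),
    ("Mo/Rh", 28): (9.8, 1.45),
    ("W/Pd 60um", 29): (10.4, 1.10),
}

def figure_of_merit(cnr: float, mgd: float) -> float:
    """FOM = CNR^2 / mean glandular dose."""
    return cnr ** 2 / mgd

# Rank the candidate settings by decreasing FOM.
ranking = sorted(
    ((figure_of_merit(cnr, mgd), spec) for spec, (cnr, mgd) in candidate_spectra.items()),
    reverse=True,
)
for fom, (combo, kvp) in ranking:
    print(f"{combo} @ {kvp} kVp: FOM = {fom:.1f}")
```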
Abstract:
In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One application is checking the safety clearances of individual components, the so-called clearance analysis. For selected components, engineers determine whether they maintain a prescribed safety distance to the surrounding components, both in their resting position and during a motion. If components fall below the safety distance, their shape or position must be changed. For this it is important to know exactly which regions of the components violate the safety distance.

In this thesis we present a solution for computing, in real time, all regions between two geometric objects that fall below the safety distance. Each object is given as a set of primitives (e.g. triangles). For every point in time at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety distance and call it the set of tolerance-violating primitives. We present a comprehensive solution that can be divided into the following three major topics.

In the first part of this thesis we study algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches for triangle-triangle tolerance tests and show that dedicated tolerance tests are significantly faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach consistently proves to be the fastest.

The second part deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is essential to account for the required safety distance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. In addition, we develop strategies for recognizing tolerance-violating primitives without computing an expensive primitive-primitive tolerance test. Our benchmarks show that our solutions can compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the uniform grids used before; we call this data structure Shrubs. Previous approaches to reducing the memory footprint of uniform grids mainly rely on hashing, but these do not reduce the memory consumption of the cell contents themselves. In our application, neighbouring cells often have similar contents. Our approach exploits this redundancy to compress the cell contents of a uniform grid losslessly to one fifth of their previous size and to decompress them at run time.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Besides pure clearance analysis, we demonstrate applications to various path-planning problems.
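As a rough illustration of the kind of prefiltering described in the second part, the following Python sketch rules out triangle pairs whose axis-aligned bounding boxes, inflated by the clearance d, do not overlap; surviving pairs would still need an exact triangle-triangle tolerance test. This is only a generic broad-phase sketch with hypothetical helper names, not the dual-space test, the combined hierarchy/grid structure or the Shrubs compression developed in the thesis.

```python
# Conservative broad-phase filter for tolerance checking: two triangles
# can only be closer than the clearance d if their axis-aligned bounding
# boxes, one of them inflated by d, overlap. Disjoint inflated boxes
# guarantee a distance greater than d, so the pair can be skipped.

import numpy as np

def aabb(tri: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Axis-aligned bounding box of a 3x3 array of triangle vertices."""
    return tri.min(axis=0), tri.max(axis=0)

def may_violate_tolerance(tri_a: np.ndarray, tri_b: np.ndarray, d: float) -> bool:
    """True if the pair cannot be ruled out by the inflated-AABB test."""
    lo_a, hi_a = aabb(tri_a)
    lo_b, hi_b = aabb(tri_b)
    # Inflate A's box by d in all directions and test box overlap.
    return bool(np.all(lo_a - d <= hi_b) and np.all(lo_b <= hi_a + d))

# Pairs surviving this filter would then go to an exact tolerance test.
tri1 = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0]])
tri2 = np.array([[0.0, 0, 2.0], [1, 0, 2.0], [0, 1, 2.0]])
print(may_violate_tolerance(tri1, tri2, d=0.5))  # False: boxes more than 0.5 apart
print(may_violate_tolerance(tri1, tri2, d=2.5))  # True: exact test still needed
```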
Abstract:
National and international studies demonstrate that the number of teenagers using the internet is increasing. But even though they do have access, from different places, to the information and communication pool of the internet, there is evidence that the ways in which teenagers use the net - regarding the scope and frequency with which services are used as well as the preferences for different contents of these services - differ significantly in relation to socio-economic status, education, and gender. The results of the relevant empirical studies may be summarised as follows: teenagers with low (formal) education mainly use internet services offering 'entertainment, play and fun', while more highly educated teenagers (also) prefer intellectually more demanding services, particularly those supplying a greater variety of communicative and informative activities. More generally, pedagogical and sociological studies investigating the "digital divide" in a differentiated and sophisticated way - i.e. not only in terms of differences between those who have access to the internet and those who do not - suggest that the internet is no space beyond 'social reality' (e.g. DiMaggio & Hargittai 2001, 2003; Vogelgesang, 2002; Welling, 2003). The different modes of use that structure the internet as a social space are primarily a specific contextualisation of the latter - and thus the opportunities and constraints in the virtual world of the internet are no less related than those in the 'real world' to unequal distributions of material, social and cultural resources and to the social embeddings of the actors involved. This inequality also holds for the outcomes of using the internet. Empirical and theoretical results concerning forms and processes of networking and community building - i.e. sociability on the internet, as well as the social embeddings of users that are mediated through the internet - suggest that net-based communication and information processes may provide the resource of 'social support'. Thus, with reference to social work and its task of counteracting the reproduction of social disadvantages - whether media-related or not - the ways in which teenagers get access to and make use of net-based social support need to be analysed.
Abstract:
The paper proposes a model for estimation of perceived video quality in IPTV, taking as input both video coding and network Quality of Service parameters. It includes some fitting parameters that depend mainly on the information contents of the video sequences. A method to derive them from the Spatial and Temporal Information contents of the sequences is proposed. The model may be used for near real-time monitoring of IPTV video quality.
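By way of illustration only, the sketch below shows the general shape of such a parametric estimator: quality is predicted from one coding parameter (bitrate) and one network QoS parameter (packet-loss ratio), with coefficients that a real system would fit per SI/TI class of the content. The functional form, the function names and all numbers are assumptions for this example, not the model proposed in the paper.

```python
# Illustrative parametric video-quality estimator: a MOS-like score on a
# 1-5 scale driven by coding bitrate and packet-loss ratio. The fitting
# parameters a, b, c stand in for the content-dependent coefficients that
# would be derived from the Spatial/Temporal Information of the sequence.

import math

def estimate_mos(bitrate_kbps: float, packet_loss: float,
                 a: float = 3.2, b: float = 900.0, c: float = 25.0) -> float:
    """Rough MOS-like estimate; a, b, c are placeholder fitting parameters."""
    coding_quality = 1.0 + a * (1.0 - math.exp(-bitrate_kbps / b))  # coding term
    loss_penalty = math.exp(-c * packet_loss)                        # network term
    mos = 1.0 + (coding_quality - 1.0) * loss_penalty
    return max(1.0, min(5.0, mos))

print(estimate_mos(bitrate_kbps=3000, packet_loss=0.0))    # high quality
print(estimate_mos(bitrate_kbps=3000, packet_loss=0.02))   # degraded by loss
```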
Abstract:
This work describes the design and application of multimedia contents for web technology-based training in minimally invasive surgery (MIS). The chosen strategy makes it possible to identify the deficiencies of current training methods so that new multimedia contents can cover them. The study concludes with the definition of three types of multimedia content, according to their degree of development and their didactic objectives: didactic resources are basic contents, such as videos or documents, that can be enhanced with contributions from users; case reports and didactic units, on the other hand, have a defined structure. Didactic resources and case reports provide informal training, while didactic units are included in more regulated training.
Abstract:
The tools for chaos analysis have been conditioned by the type of signals obtained, which in almost every case have analogue characteristics. In certain cases, however, a chaotic digital signal is obtained, and these signals need a different approach than conventional analogue ones. The main objective of this paper is to present some possible approaches to the study of such signals and to show how information about their characteristics may be obtained in the most straightforward way possible. We have obtained digital chaotic signals from an Optical Logic Cell with some feedback between the output and one of the possible control gates. This chaos has been reported in several papers, and its characteristics have been proposed as a possible method for securing communications and as a way to perform encryption. In both cases, perturbations in the transmission medium caused problems both for the synchronization of the chaotic generators at emitter and receiver and for the recovery of the information data. A proposed way to analyse the presence of such perturbations is to study the noise content of the transmitted signal and to implement a way to eliminate it. In the present case, the digital signal is converted to a multilevel one by grouping bits into packets of 8 bits and applying conventional time-frequency analysis methods to them. The results give information about the change in the signal's characteristics and hence about the noise or perturbations present. Representations equivalent to the phase and Feigenbaum diagrams for digital signals are employed in this case.
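A minimal sketch of this analysis step, assuming a generic binary stream in place of the optical logic cell output: bits are grouped into 8-bit packets, each packet is read as one multilevel sample, and a conventional spectrogram is computed over the resulting signal.

```python
# Group a digital (binary) sequence into 8-bit packets, treat each packet
# as one multilevel sample (0-255), and apply a standard time-frequency
# analysis. A pseudo-random bit stream stands in here for the chaotic
# output of the optical logic cell.

import numpy as np
from scipy.signal import spectrogram

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=8 * 4096, dtype=np.uint8)  # placeholder bit stream

# Pack every 8 bits into one multilevel sample.
levels = np.packbits(bits.reshape(-1, 8), axis=1).ravel().astype(float)

# Conventional time-frequency analysis of the multilevel signal.
f, t, Sxx = spectrogram(levels - levels.mean(), fs=1.0, nperseg=256)
print(Sxx.shape)  # (frequency bins, time segments)
```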
Abstract:
This final-year project develops a set of teaching units aimed at improving the learning of digital signal processing theory through practical application. To this end, a series of laboratory practices has been designed to allow students to reach an appropriate level of knowledge of the subject, acquire the associated skills and achieve the intended learning outcomes. To develop the project, an appropriate selection of the contents of digital signal processing theory was first made in relation to the expected learning outcomes; next, the practices were designed and validated in a working environment based on MATLAB and a DSP; and finally a laboratory manual was written that combines each theoretical part with its corresponding practice. The goal of these practices is to achieve a theoretical/practical balance that allows students to get the most out of the subject from the laboratory, working mainly with the Code Composer Studio IDE together with a DSP-based development kit.
Abstract:
This study analyses the importance of ICT at the service of libraries in general and, in particular, the potential of digital libraries. The paper reflects on the importance of digital libraries not only as content repositories but also as centres of knowledge creation. It considers the importance of building digital libraries specialized in related cultural areas and of making them multilingual, so as to preserve the contents in their original languages, while multilingual translation (from and into many languages) must be employed as a fundamental tool for improving the dissemination and knowledge of the heritage held in such libraries. In this context, the characteristics of the Biblioteca Digital Plurilingüe del Mediterráneo-IVITRA are described.