808 results for networked digital technologies
Abstract:
This study examines the use of Cybercafé/Internet resources and evaluates their usefulness. Eight Cybercafés located in the university community were covered. Questionnaires, interviews with Cybercafé owners, staff and users, and personal observations made during inspection of these cafés were used to gather data, which were analysed according to the background of the Internet users. Users judged the quality of the Internet by its richness, high speed, accuracy, and authority. Information such as the establishment of the cafés' facilities, membership, and the future of the Cybercafés was also looked into. Finally, one can clearly see that the dominating impact of digital technology has crossed the Rubicon of controversy. The survey shows that forty percent of the users learnt to use the Internet by self-instruction, and thirty-five percent learnt from colleagues or friends. Those in the sciences use the Internet the most, and the channel most used for obtaining information is the search engine. A large number of students, faculty, and researchers use the Internet to obtain information, and many of those users do so at the Cybercafés in the university community.
Abstract:
Graduate Program in Psychology - FCLAS
Abstract:
Hospital radiological workflow is currently completing a transition from analog to digital technology. Since digital X-ray detection technologies have become mature, hospitals are exploiting the natural turnover of devices to replace conventional screen-film systems with digital ones. The transition process is complex and involves not just equipment replacement but also new arrangements for image transmission, display (and reporting), and storage. This work focuses on the characterization of 2D digital detectors with attention to specific clinical applications; the system features linked to image quality are analyzed to assess clinical performance, conversion efficiency, and the minimum dose necessary to obtain an acceptable image. The first section overviews digital detector technologies, focusing on recent and promising technological developments. The second section describes the characterization methods considered in this thesis, categorized as physical, psychophysical, and clinical; theory, models, and procedures are described as well. The third section contains a set of characterizations performed on new equipment representing some of the most advanced technology available to date. The fourth section deals with procedures and schemes employed in quality assurance programs.
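As a hedged aside on the conversion-efficiency measurements such physical characterizations involve: conversion efficiency is commonly quantified by the frequency-dependent detective quantum efficiency, DQE(f) = d²·MTF²(f) / (q·NPS(f)). The sketch below evaluates this standard relation; the MTF shape, NPS level, mean signal and photon fluence are all hypothetical values, not figures from the thesis.

```python
import numpy as np

def dqe(mtf, nps, mean_signal, photon_fluence):
    """DQE(f) = d^2 * MTF(f)^2 / (q * NPS(f)) for a Poisson-limited input.

    mtf            -- presampled MTF at each frequency (1.0 at f = 0)
    nps            -- noise power spectrum [mm^2]
    mean_signal    -- large-area mean detector signal d
    photon_fluence -- incident photon fluence q [photons/mm^2]
    """
    return mean_signal**2 * mtf**2 / (photon_fluence * nps)

f = np.array([0.0, 0.5, 1.0, 2.0])   # spatial frequency [lp/mm]
mtf = np.exp(-0.5 * f)               # assumed MTF shape
nps = np.full_like(f, 6.2e-3)        # assumed flat (white) NPS [mm^2]
# Values chosen so DQE(0) is ~0.65, typical of a good flat-panel detector.
print(dqe(mtf, nps, mean_signal=100.0, photon_fluence=2.5e6))
```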
Abstract:
This doctoral dissertation aims to establish fiber-optic technologies that overcome the limiting issues of data communications in indoor environments. Specific applications are broadband mobile distribution in different in-building scenarios and high-speed digital transmission over short-range wired optical systems. Two key enabling technologies are considered: Radio over Fiber (RoF) techniques over standard silica fibers for distributed antenna systems (DAS), and plastic optical fibers (POFs) for short-range communications. Hence, the objectives and achievements of this thesis relate to the application of RoF and POF technologies in different in-building scenarios. On one hand, a theoretical and experimental analysis combined with demonstration activities has been performed on cost-effective RoF systems. Extensive modeling of the impact of modal noise on both the linear and non-linear characteristics of RoF links over silica multimode fiber has been performed to derive link design rules for an optimum choice of transmitter, receiver, and launching technique. Successful transmission of Long Term Evolution (LTE) mobile signals over the resulting optimized RoF system on silica multimode fiber, employing a Fabry-Perot laser diode, the central launch technique, and a photodiode with a built-in ball lens, was demonstrated up to 525 m with performance well within standard requirements. On the other hand, digital signal processing techniques to overcome the bandwidth limitation of POF have been investigated. An uncoded net bit-rate of 5.15 Gbit/s was obtained over a 50 m long POF link employing an eye-safe transmitter, a silicon photodiode, and DMT modulation with a bit and power loading algorithm. With the insertion of 3x2^N quadrature amplitude modulation constellation formats, an uncoded net bit-rate of 5.4 Gbit/s was obtained over a 50 m long POF link employing an eye-safe transmitter and a silicon avalanche photodiode. Moreover, simultaneous transmission of 2 Gbit/s baseband DMT and a 200 Mbit/s ultra-wideband radio signal has been validated over a 50 m long POF link.
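As a sketch of what a bit and power loading step can look like (a generic greedy, Hughes-Hartogs-style loop, not necessarily the algorithm used in this dissertation), the following snippet assigns bits one at a time to the subcarrier that needs the least incremental power; the channel profile, power budget and 9.8 dB SNR gap are illustrative assumptions.

```python
import numpy as np

def greedy_bit_loading(gain_to_noise, power_budget,
                       gamma=10**(9.8 / 10), max_bits=10):
    """Greedy DMT bit loading: P(b) = gamma * (2^b - 1) / g per subcarrier."""
    bits = np.zeros(len(gain_to_noise), dtype=int)
    power = np.zeros(len(gain_to_noise))
    used = 0.0
    while True:
        # Incremental power needed to carry one more bit on each subcarrier.
        inc = gamma * (2.0 ** bits) / gain_to_noise
        inc[bits >= max_bits] = np.inf
        k = int(np.argmin(inc))
        if not np.isfinite(inc[k]) or used + inc[k] > power_budget:
            break
        bits[k] += 1
        power[k] += inc[k]
        used += inc[k]
    return bits, power

# Hypothetical low-pass channel with 16 subcarriers (POF-like roll-off).
g = 1.0 / (1.0 + (np.arange(16) / 4.0) ** 2)
bits, power = greedy_bit_loading(g, power_budget=50.0)
print(bits, bits.sum(), "bits per DMT symbol")
```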
Abstract:
Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than taming variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher's methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show from several perspectives the interaction of statistics, computing and information technologies, giving on the one hand an overview of the main tools – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – adopted in the period, and on the other pointing out how these tools complemented each other and were instrumental in the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.
Abstract:
This thesis focuses on Smart Grid applications in medium-voltage distribution networks. For the development of new applications, it is useful to have simulation tools able to model the dynamic behavior of both the power system and the communication network. Such a co-simulation environment allows assessing the feasibility of using a given network technology to support communication-based Smart Grid control schemes on an existing segment of the electrical grid, and determining the range of control schemes that different communication technologies can support. For this reason, a co-simulation platform is presented that has been built by linking the Electromagnetic Transients Program Simulator (EMTP v3.0) with a Telecommunication Network Simulator (OPNET-Riverbed v18.0). The simulator is used to design and analyze the coordinated use of Distributed Energy Resources (DERs) for voltage/var control (VVC) in distribution networks. The thesis concentrates on control structures based on the use of phasor measurement units (PMUs). In order to limit the required reinforcement of the communication infrastructures currently adopted by Distribution Network Operators (DNOs), the study focuses on leader-less multi-agent system (MAS) schemes that do not assign special coordinating roles to specific agents. Leader-less MAS are expected to produce more uniform communication traffic than centralized approaches that include a moderator agent, and to be less affected by the limitations and constraints of individual communication links. The developed co-simulator has allowed the definition of specific countermeasures against the limitations of the communication network, with particular reference to latency and loss of information, for both wired and wireless communication networks. Moreover, the co-simulation platform has also been coupled with a mobility simulator in order to study specific countermeasures against the negative effects on the medium-voltage distribution network caused by the concurrent connection of electric vehicles.
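To make the leader-less MAS idea concrete, below is a minimal sketch (with hypothetical topology, step size and values, not the control scheme of the thesis) in which each DER agent repeatedly averages a coordination variable with its communication neighbours only, so no moderator agent is needed and traffic is uniform across links.

```python
import numpy as np

# Four DER agents on a communication ring; 1 = link between agents.
adjacency = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

x = np.array([0.9, 0.2, 0.5, 0.4])  # local coordination variables (e.g. var utilization)
eps = 0.2                           # step size, below 1 / (max node degree)

for _ in range(50):
    # Each agent moves toward its neighbours' values; a lossy link could
    # be modelled by zeroing the corresponding adjacency entries.
    x = x + eps * (adjacency @ x - adjacency.sum(axis=1) * x)

print(x)  # all agents agree on the network-wide average, ~0.5
```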
Abstract:
Electric power grids throughout the world suffer from serious inefficiencies associated with under-utilization due to demand patterns, engineering design, and the load-following approaches in use today. These grids consume much of the world's energy and represent a large carbon footprint. From a material utilization perspective, significant hardware is manufactured and installed for this infrastructure, often to be used at less than 20-40% of its operational capacity for most of its lifetime. These inefficiencies lead engineers to require additional grid support and conventional generation capacity additions when renewable technologies (such as solar and wind) and electric vehicles are added to the utility demand/supply mix. Using actual data from PJM [PJM 2009], this work shows that consumer load management, real-time price signals, sensors, and intelligent demand/supply control offer a compelling path to increasing efficient utilization and reducing the carbon footprint of the world's grids. Utilization factors from many distribution companies indicate that distribution feeders are often operated at only 70-80% of their peak capacity for a few hours per year, and on average are loaded to less than 30-40% of their capability. By creating strong societal connections between consumers and energy providers, technology can radically change this situation. Through the intelligent deployment of smart sensors, smart electric vehicles, and consumer-based load management technology, very high saturations of intermittent renewable energy supplies can be effectively controlled and dispatched, increasing the utilization of existing utility distribution, substation, transmission, and generation equipment. Strengthening these technology, society, and consumer relationships requires rapid dissemination of knowledge (real-time prices, costs and benefit sharing, demand response requirements) in order to incentivize behaviors that increase the effective use of technological equipment, which represents one of the largest capital assets modern society has created.
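A quick numeric sketch of the utilization arithmetic quoted above: a feeder that reaches its annual peak for only a few hours can still show a very low average loading. The rating and synthetic profile below are assumptions chosen only to reproduce the quoted 70-80% peak / 30-40% average pattern.

```python
import numpy as np

rating = 10.0                                 # feeder capacity [MVA]
rng = np.random.default_rng(0)
load = 3.0 + 0.5 * rng.standard_normal(8760)  # hourly load over a year [MVA]
load[:40] = 8.0                               # a few peak hours per year

print(f"peak utilization:    {load.max() / rating:.0%}")   # ~80%
print(f"average utilization: {load.mean() / rating:.0%}")  # ~30%
print(f"load factor:         {load.mean() / load.max():.0%}")
```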
Abstract:
It is a central premise of the advertising campaigns for nearly all digital communication devices that buying them augments the user: they give us a larger, better memory; make us more “creative” and “productive”; and/or empower us to access whatever information we desire from wherever we happen to be. This study concerns how recent popular cinema represents the failure of these technological devices to inspire the enchantment that they once did, and it opens the question of what is causing this failure. Using examples from the James Bond films, the essay analyzes the ways in which human users are frequently represented as the media connecting and augmenting digital devices, and NOT the reverse. It draws on debates about the ways in which our subjectivity is itself a networked phenomenon, as well as on the extended-mind debate in the philosophy of mind. It sets out to prove (1) that this represents an important counter-narrative to the technophilic optimism about augmentation that pervades contemporary advertising, consumer culture, and educational debates; and (2) that this particular discourse of augmentation is really about technological advances and not advances in human capacity.
Abstract:
The single-electron transistor (SET) is a Coulomb blockade device whose operation is based on the controlled manipulation of individual electrons. Single-electron transistors show immense potential for use in future ultra-low-power devices, high-density memory, and high-precision electrometry. Most SET devices operate at cryogenic temperatures because the charging energy is much smaller than the thermal fluctuations. Room-temperature operation of these devices is possible with sub-10 nm nano-islands, owing to the inverse dependence of the charging energy on the radius of the conducting nano-island. The fabrication of sub-10 nm features with existing lithographic techniques is a technological challenge. Here we present results for the first room-temperature operating SET device fabricated using Focused Ion Beam deposition technology. The SET device incorporates an array of tungsten nano-islands with an average diameter of 8 nm, and it shows clear Coulomb blockade for different gate voltages at room temperature. The charging energy of the device was calculated to be 160.0 meV, the capacitance per junction 0.94 aF, and the tunnel resistance per junction 1.26 GΩ. The tunnel resistance is five orders of magnitude larger than the resistance quantum (26 kΩ) and allows for the localization of electrons on the tungsten nano-island. The low capacitance of the device, combined with the high tunnel resistance, allows the Coulomb blockade effects observed at room temperature. Different device configurations minimizing the total capacitance of the device have been explored. The effect of the geometry of the nano-electrodes on the device characteristics is presented, and simulated device characteristics based on the soliton model are discussed. The first application of a SET device as a gas sensor is also demonstrated.
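A hedged back-of-envelope check of the physics above: Coulomb blockade survives at a given temperature only if the charging energy E_C = e²/(2C_Σ) far exceeds the thermal energy k_B·T (about 26 meV at 300 K). Modelling the island self-capacitance as that of a conducting sphere, C = 4πε₀r (an illustrative assumption, not the actual device geometry), shows why sub-10 nm islands are required.

```python
import numpy as np

e = 1.602176634e-19      # elementary charge [C]
k_B = 1.380649e-23       # Boltzmann constant [J/K]
eps0 = 8.8541878128e-12  # vacuum permittivity [F/m]

def charging_energy_meV(c_sigma):
    """E_C = e^2 / (2 * C_sigma), returned in meV."""
    return e**2 / (2.0 * c_sigma) / e * 1e3

kT_meV = k_B * 300.0 / e * 1e3  # thermal energy at 300 K, ~25.9 meV
for radius_nm in (50, 10, 4):
    c = 4.0 * np.pi * eps0 * radius_nm * 1e-9  # sphere self-capacitance
    print(f"{radius_nm:2d} nm island: E_C = {charging_energy_meV(c):6.1f} meV "
          f"vs k_B*T = {kT_meV:.1f} meV")
```

Only the island a few nanometres in radius (8 nm diameter) keeps E_C well above k_B·T, consistent with the ~160 meV charging energy reported above.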
Abstract:
In this project we developed conductive thermoplastic resins by adding varying amounts of three different carbon fillers: carbon black (CB), synthetic graphite (SG), and multi-walled carbon nanotubes (CNT) to a polypropylene matrix for application as fuel cell bipolar plates. This component of a fuel cell provides mechanical support to the stack, circulates the gases that participate in the electrochemical reaction, and allows removal of excess heat from the system. The materials fabricated in this work were tested to determine their mechanical and thermal properties. These materials were produced by adding varying amounts of single carbon fillers to a polypropylene matrix (2.5 to 15 wt.% Ketjenblack EC-600 JD carbon black, 10 to 80 wt.% Asbury Carbons' Thermocarb TC-300 synthetic graphite, and 2.5 to 15 wt.% of Hyperion Catalysis International's FIBRIL™ multi-walled carbon nanotubes). In addition, composite materials containing combinations of these three fillers were produced. The thermal conductivity results showed an increase in both through-plane and in-plane thermal conductivity, with the largest increase observed for synthetic graphite. The Department of Energy (DOE) had previously set a thermal conductivity goal of 20 W/m·K, which was surpassed by formulations containing 75 wt.% and 80 wt.% SG, yielding in-plane thermal conductivity values of 24.4 W/m·K and 33.6 W/m·K, respectively. In addition, composites containing 2.5 wt.% CB, 65 wt.% SG, and 6 wt.% CNT in PP had an in-plane thermal conductivity of 37 W/m·K. Flexural and tensile tests were conducted; all composite formulations exceeded the flexural strength target of 25 MPa set by the DOE. The tensile and flexural moduli of the composites increased with higher concentrations of carbon fillers. Carbon black and synthetic graphite decreased the tensile and flexural strengths of the composites, whereas carbon nanotubes increased them. Mathematical models were applied to estimate the through-plane and in-plane thermal conductivities of single- and multiple-filler formulations, and the tensile modulus of single-filler formulations. For thermal conductivity, Nielsen's model yielded accurate values when compared with experimental results obtained through the Flash method. For prediction of tensile modulus, Nielsen's model yielded the smallest error between predicted and experimental values. The second part of this project consisted of the development of a curriculum in Fuel Cell and Hydrogen Technologies to address educational barriers identified by the Department of Energy. Through the creation of new courses and enterprise programs in the areas of fuel cells and the use of hydrogen as an energy carrier, we introduced engineering students to the new technologies, policies, and challenges presented by this alternative energy. Feedback provided by students participating in these courses and enterprise programs indicates positive acceptance of the different educational tools. Results from a survey administered to students after participating in these courses showed an increase in knowledge and awareness of energy fundamentals, indicating that the modules developed in this project are effective in introducing students to alternative energy sources.
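Since Nielsen's model is singled out above as the most accurate predictor, a minimal sketch of the Lewis-Nielsen relation is given below; the shape factor A, maximum packing fraction phi_m, and the filler/matrix conductivities are illustrative assumptions, not values fitted in this work.

```python
def nielsen_thermal_conductivity(k_m, k_f, phi, A=1.5, phi_m=0.637):
    """Lewis-Nielsen estimate of composite thermal conductivity [W/m.K].

    k_m, k_f -- matrix and filler conductivities; phi -- filler volume
    fraction; A -- shape factor (grows with filler aspect ratio);
    phi_m -- maximum packing fraction of the filler.
    """
    B = (k_f / k_m - 1.0) / (k_f / k_m + A)
    psi = 1.0 + phi * (1.0 - phi_m) / phi_m**2
    return k_m * (1.0 + A * B * phi) / (1.0 - B * psi * phi)

# Hypothetical polypropylene matrix (0.24 W/m.K) with a graphite-like filler.
for phi in (0.1, 0.3, 0.5):
    k = nielsen_thermal_conductivity(k_m=0.24, k_f=300.0, phi=phi)
    print(f"phi = {phi:.1f}: k ~ {k:.2f} W/m.K")
```

In practice A and phi_m would be tuned per filler: plate-like graphite takes a much larger shape factor than spherical carbon black, which is how such models approach the measured conductivities.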
Abstract:
The Melungeons, a minority recognized in Southern Appalachia where they settled in the early 1800s, have mixed heritage—European, Mediterranean, Native American, and Sub-Saharan African. Their dark skin and distinctive features have marked them and been the cause of racial persecution, both by custom and by law, in Appalachia for two centuries. Their marginalization has led to an insider mentality, which I call a “literacy” of Melungeon-ness, that affects every facet of their lives. Just a century ago, while specialized practices such as farming, preserving food, hunting, gathering, and distilling ensured survival in the unforgiving mountain environment, few Melungeons could read or write. Required to pay property taxes and render military service, they were denied education, suffrage, and other legal rights. In the late 1890s the visionary Melungeon leader Batey Collins invited Presbyterian home missionaries to settle in one Tennessee Melungeon community, where they established a church and built a school of unparalleled excellence. Educator-ministers Mary Rankin and Chester Leonard creatively reified the theories of Dewey, Montessori, and Rauschenbusch, but, despite their efforts, school literacy did not neutralize difference. Now, taking reading and writing for granted, Melungeons are exploring their identity by creating websites and participating in listserv discussions. These online expressions, which provide texts for rhetorical, semiotic, and socio-linguistic analysis, illustrate not solidarity but fragmentation on issues of origins and legitimacy. Armed with literacies of difference stemming from both nature and nurture, Melungeons are using literacy practices to embrace the difference they cannot escape.
Abstract:
Engine manufacturers need computationally efficient and accurate predictive combustion modeling tools that can be integrated into engine simulation software for the assessment of combustion system hardware designs and the early development of engine calibrations. This thesis discusses the process of developing and validating, from experimental data, a combustion modeling tool for a gasoline direct-injection spark-ignition engine with variable valve timing, lift, and duration valvetrain hardware. Data were correlated and regressed using accepted methods for calculating the turbulent flow and flame propagation characteristics of an internal combustion engine. A non-linear regression modeling method was used to develop a combustion model that determines the fuel mass burn rate at multiple points during the combustion process. The computational fluid dynamics software Converge© was used to simulate and correlate the 3-D combustion system, port, and piston geometry to the turbulent flow development within the cylinder, in order to properly predict the experimentally measured turbulent flow parameters through the intake, compression, and expansion processes. The engine simulation software GT-Power© was then used to determine the 1-D flow characteristics of the engine hardware under test and to correlate the regressed combustion modeling tool against experimental data to assess its accuracy. The results show that the combustion modeling tool accurately captures the combustion sensitivities to turbulent flow, thermodynamic, and internal residual effects as intake and exhaust valve timing, lift, and duration change.
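As an illustration of the kind of non-linear regression involved (not the thesis's actual model), the sketch below fits a standard Wiebe mass-fraction-burned curve to synthetic combustion data with SciPy; all parameter values and data are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def wiebe(theta, theta0, duration, a, m):
    """Wiebe mass fraction burned vs crank angle theta [deg]."""
    x = np.clip((theta - theta0) / duration, 0.0, None)
    return 1.0 - np.exp(-a * x ** (m + 1))

theta = np.linspace(-20.0, 60.0, 81)  # crank angle [deg ATDC]
# Synthetic "measured" burn curve: known parameters plus a little noise.
mfb = wiebe(theta, -5.0, 45.0, 5.0, 2.0)
mfb += 0.01 * np.random.default_rng(1).standard_normal(theta.size)

popt, _ = curve_fit(wiebe, theta, mfb, p0=(-10.0, 50.0, 6.9, 2.0))
print("theta0, duration, a, m =", np.round(popt, 2))
```

The same regression machinery extends to multiple burn points (e.g. 10%, 50%, 90% burned) expressed as functions of valve timing, lift, and duration.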
Abstract:
The persuasive power of music is often relegated to the dimension of pathos: that which moves us emotionally. Yet the music commodity is now situated in and around the liminal spaces of digitality. To think about how music functions, how it argues across media, and how it moves us, we must examine its material and immaterial realities as they present themselves to us and as we create them. This dissertation rethinks the relationship between rhetoric and music by examining the creation, performance, and distribution of music in its material and immaterial forms to demonstrate its persuasive power. While both Plato and Aristotle understood music as a means to move men toward virtue, Plato tells us in his Laws, through the Athenian Stranger, that the very best kinds of music can help guide us to truth. From this starting point, I assess the historical problem of understanding the rhetorical potential of music as merely that which directs or imitates the emotions: that which “soothes the savage breast,” as William Congreve writes. By furthering work by Vickers and Farnsworth, who suggest that the Baroque fascination with applying rhetorical figures to musical figures is an insufficient framework for assessing the rhetorical potential of music, I demonstrate the gravity of musical persuasion in its political weight, in its violence—the subjective violence of musical torture at Guantanamo and the objective, ideological violence of music—and in what Jacques Attali calls the prophetic nature of music. I argue that music has a significant function and, as a non-discursive form of argumentation, works on us beyond affect. Moreover, with the emergence of digital music distribution and domestic digital recording technologies, the digital music commodity in its material and immaterial forms allows for ruptures in former methods of musical composition, production, and distribution, and in the political potential of music, which Jacques Attali describes as being able to foresee new political realities. I thus suggest a new theoretical framework for thinking about rhetoric and music by expanding on Lloyd Bitzer's rhetorical situation, offering the idea of “openings” alongside the existing exigence, audience, and constraints. The prophetic and rhetorical power of music in the aleatoric moment can help provide openings from which new exigencies can be conceived. We must, therefore, reconsider the role of rhetorical-musical composition for the citizen, not merely as a tool for entertainment or emotional persuasion, but as an arena for engaging with the political.
Abstract:
The explosion of multimedia digital content and the development of technologies that go beyond traditional broadcast and TV have made access to such content important for all end-users of these technologies. While originally developed for providing access to multimedia digital libraries, video search technologies now assume a more demanding role. In this paper, we attempt to shed light on this new role, looking at the rapid developments in the related market, the lessons learned from state-of-the-art video search prototypes developed mainly in the digital libraries context, and the new technological challenges that have arisen. We focus on one of the latter, i.e., the development of cross-media decision mechanisms, drawing examples from REVEAL THIS, an FP6 project on the retrieval of video and language for the home user. We argue that efficient video search holds a key to the usability of the new “pervasive digital video” technologies and that it should involve cross-media decision mechanisms.
Abstract:
Innovations in hardware and network technologies have led to an exploding number of non-interrelated parallel media streams. In itself, this does not create any additional value for consumers, and the broadcasting and advertising industries have not yet found new formats to reach the individual user with their content. In this work we propose and describe a novel digital broadcasting framework that allows for the live staging of (mass) media events and improved consumer personalisation. In addition, we describe new professions that will emerge in future TV production workflows, namely the 'video composer' and the 'live video conductor'.