877 results for Technologies and libraries
Abstract:
This work will explore and motivate perspectives and research issues related to the application of automated planning technologies to support innovative web applications. The target of this technology transfer, i.e. the web and, in a broader sense, the new Information Technologies (IT), is one of the fastest-changing, most rapidly evolving and hottest areas of current computer science. Many sub-areas in this field could benefit from Planning and Scheduling (P&S) technologies, and in some cases the technology transfer has already started. This paper will consider and explore a set of topics, guidelines and objectives for implementing this technology transfer, as well as the new challenges, requirements and research issues for planning that emerge from the web and the IT industry. Sample scenarios will be depicted to clarify the potential applications and the limits of current planning technology. Finally, we will point out some new P&S research challenges that must be addressed to meet more advanced application goals.
Abstract:
The purpose of this paper is to introduce Digital Rights Management (DRM) and its implications for content producers, consumers, and libraries. Simply stated, DRM is a technology that allows copyright owners to regulate and manage their content when it is disseminated in a digital format, and it is the reason why some patrons cannot access some of the downloadable digital content provided by libraries. In the first part of this paper, we provide a short introduction to DRM by outlining the entities involved, the various technologies used, and the usage restrictions that come with DRM. In the second part of the paper, we discuss the alternatives available to libraries, the use of DRM as a tool for library copyright policy, and the main documents that present the position of library organizations on information legislation.
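To make the notion of usage restrictions concrete, the toy Python sketch below models the kind of rules a DRM licence for a library loan might encode (a loan expiry date and a device cap). All names and rules here are hypothetical illustrations; no real DRM scheme or library API is implied.

```python
# Toy sketch of usage restrictions a DRM licence for a library loan
# might encode; purely illustrative, not any real DRM scheme.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Licence:
    expires: datetime                  # end of the library loan period
    max_devices: int = 2               # simultaneous-device cap
    devices: set[str] = field(default_factory=set)

    def authorise(self, device_id: str, now: datetime) -> bool:
        """Decide whether this device may open the content right now."""
        if now > self.expires:
            return False               # loan has lapsed: access revoked
        if device_id not in self.devices and len(self.devices) >= self.max_devices:
            return False               # device cap reached
        self.devices.add(device_id)
        return True

loan = Licence(expires=datetime(2024, 1, 15))
print(loan.authorise("reader-1", datetime(2024, 1, 10)))  # True: within loan
print(loan.authorise("reader-1", datetime(2024, 1, 20)))  # False: expired
```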
Abstract:
This thesis examines digital technologies policies designed for Australian schools and the ways they are understood and interpreted by students, school staff, teachers, principals and policy writers. This study explores the ways these research participant groups interpret and understand the ethical dimension of schools' digital technologies policies for teaching and learning. In this thesis the ethical dimension is considered to be a dynamic concept encompassing various elements, including decisions, actions, values, issues, debates, education, discourses, and notions of right and wrong, in relation to ethics and the use of digital technologies in schools. In this study, policy is taken to mean not only written texts but also discursive processes; policy documents include national declarations, strategic plans and acceptable use policies that guide the use of digital technologies in schools. The research is situated in the context of changes that have occurred in Australia and internationally over the last decade, which have brought a greater focus on access to and use of digital technologies in schools. In Australian school education, the attention placed on digital technologies has seen the release of policies at the national, state, territory, education office and school levels to guide their use. Prominent among these policies has been the Digital Education Revolution policy, launched in 2007 and concluded in 2013. This research aims to answer the question: What does an investigation reveal about understandings of the ethical dimension of digital technologies policies and their implementation in school education? The objective of this research is to examine the ethical dimension of digital technologies policies and to interpret and understand the responses of the research participants to the issues, silences, discourses and language that characterise this dimension. In doing so, it is intended that the research can give the participants a voice that may differ from the official discourses located in digital technologies policies. The thesis takes a critical and interpretative approach to policies and examines the role of digital technologies policies as discourse. Interpretative theory is utilised as it provides a conceptual lens through which to interpret different perspectives and their implications for the construction of meaning in relation to schools' digital technologies policies. Critical theory is used in tandem with interpretative theory as it provides a conceptual basis from which to critique and question the underlying assumptions and discourses associated with the ethical dimension of schools' digital technologies policies. The research methods used are semi-structured interviews and policy document analysis. Policies from the national, state, territory, education office and school levels were analysed and contribute to understanding the way the ethical dimension of digital technologies policies is represented as a discourse. Students, school staff, teachers, principals and policy writers participated in research interviews, and their views and perspectives were canvassed in relation to the ethical use of digital technologies and the policies designed to regulate their use. The thesis argues that the ethical dimension of schools' digital technologies policies and use is an under-researched area, and that there are gaps in understanding and knowledge in the literature which remain to be addressed.
It is envisaged that the thesis can make a meaningful contribution to understanding the ways in which schools' digital technologies policies are understood in school contexts. It is also envisaged that the findings of the research can inform policy development by analysing the voices and views of those in schools. The findings of the policy analysis revealed that little attention is given to the ethical dimension of digital technologies at the national level. A discourse of compliance and control pervades digital technologies policies at the state, education office and school levels, which reduces ethical considerations to technical, legal and regulatory requirements. The discourse is largely instrumentalist and neglects the educative dimension of digital technologies, which has the capacity to engender their ethical use. The findings from the interview conversations revealed that students, school staff and teachers perceive digital technologies policies to be difficult to understand and not relevant to their situation and needs. They also expressed a desire for greater consultation and participation in the formation and enactment of digital technologies policies, and they believe they are marginalised from these processes in their schools. Arising from the analysis of the policies and interview conversations, an argument is presented that, in light of the prominent role played by digital technologies and their potential for enhancing all aspects of school education, more research is required to provide a more holistic and richer understanding of the policies constructed to control and mediate their use.
Abstract:
The integration of Information and Communication Technologies (ICT) in the tourism industry is an essential element for the success of any tourism enterprise. ICTs provide access to information on tourism products from anywhere and at any time. Tourism companies can also reach target customers around the world through a range of emerging technologies. This paper aims to review the main key factors of ICT in tourism. Aspects such as website quality, Digital Marketing, Social Networking, Multimedia, Mobile Technologies and Intelligent Environments are discussed.
Abstract:
In 2013, a series of posters began appearing in Washington, DC's Metro system. Each declared "The internet: Your future depends on it" next to a photo of a middle-aged black Washingtonian and an advertisement for the municipal government's digital training resources. This hopeful discourse is familiar, but where exactly does it come from? And how are our public institutions reorganized to approach the problem of poverty as a problem of technology? The Clinton administration's digital divide policy program popularized this hopeful discourse about personal computing powering social mobility, positioned internet startups as the right side of the divide, and charged institutions of social reproduction such as schools and libraries with closing the gap and upgrading themselves in the image of internet startups. After introducing the development regime that builds this idea into the urban landscape through what I call the political economy of hope, and tracing the origin of the digital divide frame, this dissertation draws on three years of comparative ethnographic fieldwork in startups, schools, and libraries to explore how this hope is reproduced in daily life, becoming the common sense that drives our understanding of and interaction with economic inequality and reproduces that inequality in turn. I show that the hope in personal computing to power social mobility becomes a method of securing legitimacy and resources for both white émigré technologists and institutions of social reproduction struggling to understand and manage the persistent poverty of the information economy. I track the movement of this common sense between institutions, showing how the political economy of hope transforms them as part of a larger development project. This dissertation models a new, relational direction for digital divide research that grounds the politics of economic inequality in an empirical focus on technologies of poverty management. It demands a conceptual shift that sees the digital divide not as a bug within the information economy, but as a feature of it.
Abstract:
Dairy cattle farms have a well-known environmental impact that affects all ecological compartments: air, soil, water and biosphere [1]. Dairy cattle farming is a significant source of anthropogenic gases from enteric fermentation, manure storage and land application, mainly ammonia (NH3), nitric oxide (NO), nitrous oxide (N2O), carbon dioxide (CO2) and methane (CH4). The emission of such gases represents not only an environmental problem but also leads to energy and nitrogen (N) losses in ruminant production systems [2-5]. Considerable effort is required to develop new technologies and strategies that mitigate gaseous emissions and N losses and improve the efficiency of the energy and N cycles [6, 7]. In the Northwest of Portugal, dairy cattle production has a major impact on the economy, with strong repercussions at the national scale. Therefore, our Ph.D. thesis project aims to: a) study natural supplements as additives in the dairy cattle diet towards a decrease in greenhouse gas (GHG) emissions from feeding operations; b) compare commercial dairy cattle diets with and without additives in terms of gaseous emissions from manure deposited on a simulated concrete floor; c) assess the concentrations and emissions of NH3 and GHGs from commercial dairy cattle facilities; and d) evaluate the effects of different additives on lowering gaseous emissions from dairy cattle excreta, using a laboratory system simulating a dairy house concrete floor.
Abstract:
The multi-faceted evolution of network technologies ranges from big data centers to specialized network infrastructures and protocols for mission-critical operations. For instance, technologies such as Software Defined Networking (SDN) have revolutionized the static configuration of networks, removing the distributed, proprietary configuration of switched networks and centralizing the control plane. While this disruptive approach is interesting from different points of view, it can introduce new, unforeseen classes of vulnerabilities. One topic of particular interest in recent years is industrial network security, an interest which started to rise in 2016 with the introduction of the Industry 4.0 (I4.0) movement. Networks that were essentially isolated by design are now connected to the internet to collect, archive, and analyze data. While this approach has gained considerable momentum owing to its predictive-maintenance capabilities, these network technologies can be exploited in various ways from a cybersecurity perspective. Some of these technologies lack security measures and can introduce new families of vulnerabilities. On the other hand, these networks can be used to enable accurate monitoring, formal verification, or defenses that were not practical before. This thesis explores both sides: it introduces monitoring, protection, and detection mechanisms where the new network technologies make them feasible, and it demonstrates attacks in practical scenarios involving insufficiently protected emerging network infrastructures. The goal of this thesis is to highlight this lack of protection, both in terms of attacks on emerging technologies and of the defenses they enable. We pursue this goal by analyzing the aforementioned technologies and by presenting three years of contributions to this field. In conclusion, we recapitulate the research questions and give answers to them.
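As a concrete illustration of the centralized control plane that the abstract contrasts with distributed switch configuration, here is a minimal, self-contained Python sketch: a single controller object computes a path and installs forwarding rules into each switch's flow table. The classes and fields (FlowRule, Switch, Controller) are invented for illustration and do not correspond to any real SDN controller API.

```python
# Illustrative toy model of SDN's centralized control plane: one
# controller computes and installs flow rules on every switch.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class FlowRule:
    match_dst: str      # destination host this rule matches
    out_port: int       # port the switch should forward on
    priority: int = 0

@dataclass
class Switch:
    name: str
    flow_table: list[FlowRule] = field(default_factory=list)

    def install(self, rule: FlowRule) -> None:
        # In a real deployment this would arrive over a southbound
        # protocol such as OpenFlow; here it is a direct call.
        self.flow_table.append(rule)
        self.flow_table.sort(key=lambda r: -r.priority)

class Controller:
    """Single point of control: the 'centralized control plane'."""
    def __init__(self, switches: dict[str, Switch]):
        self.switches = switches

    def program_path(self, dst_host: str, hops: list[tuple[str, int]]) -> None:
        # hops: (switch_name, out_port) pairs along the computed path.
        for switch_name, port in hops:
            self.switches[switch_name].install(FlowRule(dst_host, port))

if __name__ == "__main__":
    net = {n: Switch(n) for n in ("s1", "s2")}
    ctrl = Controller(net)
    ctrl.program_path("10.0.0.2", [("s1", 2), ("s2", 1)])
    print(net["s1"].flow_table)  # one rule forwarding 10.0.0.2 via port 2
```

The sketch also hints at why this design concentrates risk: compromising the single Controller instance would let an attacker rewrite every flow table at once, which is the class of vulnerability the abstract alludes to.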
Abstract:
The ability to create hybrid systems that blend different paradigms has become a requirement for complex AI systems, which are usually made of more than one component. In this way, it is possible to exploit the advantages of each paradigm and the potential of different approaches, such as symbolic and non-symbolic ones. In particular, non-symbolic approaches are often exploited for their efficiency, effectiveness and ability to manage large amounts of data, while symbolic approaches are exploited to ensure explainability, fairness, and trustworthiness in general. The thesis lies in this context, in particular in the design and development of symbolic technologies that can be easily integrated and made interoperable with other AI technologies. 2P-Kt is a symbolic ecosystem developed for this purpose; it provides a logic-programming (LP) engine which can be easily extended and customized to deal with specific needs. The aim of this thesis is to extend 2P-Kt to support constraint logic programming (CLP) as one of the main paradigms for solving highly combinatorial problems, given a declarative problem description and a general constraint-propagation engine. A real case study concerning school timetabling is described to show a practical usage of the implemented CLP(FD) library. Since CLP represents only one particular scenario for extending LP to domain-specific settings, this thesis also presents a more general framework: Labelled Prolog, which extends LP with labelled terms and, in particular, labelled variables. The designed framework shows how it is possible to frame all variations and extensions of LP under a single language, reducing the huge number of existing languages and libraries and focusing instead on how to manage different domain needs using labels, which can be associated with every kind of term. The mapping of CLP into Labelled Prolog is also discussed, as well as the benefits of the proposed approach.
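The following Python sketch illustrates the CLP(FD) recipe the abstract refers to: finite-domain variables, a declarative constraint (here all-different, the typical timetabling clash constraint), and a propagate-then-search loop. It is a toy rendering of the paradigm, not the 2P-Kt API or any existing CLP(FD) library.

```python
# Minimal sketch of the CLP(FD) idea: finite-domain variables,
# declarative constraints, and a propagate-then-search loop.
# Illustrative toy only, not the 2P-Kt API.

def propagate(domains, constraints):
    """Prune values that no constraint-consistent assignment can use."""
    changed = True
    while changed:
        changed = False
        for var, dom in domains.items():
            # A singleton domain removes its value from 'all_different' peers.
            if len(dom) == 1:
                val = next(iter(dom))
                for peers in constraints:
                    if var in peers:
                        for other in peers:
                            if other != var and val in domains[other]:
                                domains[other] = domains[other] - {val}
                                changed = True
        if any(not d for d in domains.values()):
            return None  # a domain was wiped out: inconsistent
    return domains

def solve(domains, constraints):
    domains = propagate(dict(domains), constraints)
    if domains is None:
        return None
    if all(len(d) == 1 for d in domains.values()):
        return {v: next(iter(d)) for v, d in domains.items()}
    # Branch on the smallest unresolved domain (first-fail heuristic).
    var = min((v for v, d in domains.items() if len(d) > 1),
              key=lambda v: len(domains[v]))
    for val in sorted(domains[var]):
        trial = dict(domains)
        trial[var] = {val}
        result = solve(trial, constraints)
        if result is not None:
            return result
    return None

# Toy timetabling: three lessons in slots 1-3, no two in the same slot.
doms = {"math": {1, 2, 3}, "physics": {1}, "history": {1, 2}}
print(solve(doms, constraints=[{"math", "physics", "history"}]))
# -> {'math': 3, 'physics': 1, 'history': 2}
```

In a real CLP(FD) system the same problem would be stated declaratively (domains plus an all-different goal) and the engine would interleave propagation and labelling exactly as the loop above does, only with far stronger pruning algorithms.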
Targeted! Population segmentation, electronic surveillance and governing the unemployed in Australia
Abstract:
Targeting is increasingly used to manage people. It operates by segmenting populations and providing different levels of opportunities and services to these groups. Each group is subject to different levels of surveillance and scrutiny. This article examines the deployment of targeting in Australian social security. Three case studies of targeting are presented: Australia's management of benefit overpayment and fraud, the distribution of employment services, and the application of workfare. In conceptualizing surveillance as governance, the analysis examines the rationalities, technologies and practices that make targeting thinkable, practicable and achievable. In the case studies, targeting is variously conceptualized and justified by calculative risk discourses, moral discourses of obligation, and notions of welfare dependency. Advanced information technologies are also seen as particularly important in giving rise to the capacity to think about and act on population segments.
Abstract:
Based on a comprehensive study in research centers and libraries, a panorama of the critical response to Argentine rock in academic circles has been traced across various fields of knowledge. Research works including books, trade magazines, and academic papers on this musical movement, spanning four decades, have been consulted. The results show that the criticism has generated a "developing tradition" around the movement, mainly in the Social and Communication Sciences, and that it presents the following predominant traits: approaching rock essentially as a determinant of "social identities"; periodizing the genre according to political regime; and analysing the lyrics based solely on content. These traits contribute to a delimitation of rock as an exclusive phenomenon of the mass media, which relegates its aesthetic function and its relationship with other artistic series to secondary importance. Furthermore, we have observed a scarcity of approaches to rock lyrics as a linguistic-discursive surface.
Abstract:
Breeding methodologies for cultivated lucerne (Medicago sativa L.), an autotetraploid, have changed little over the last 50 years, with reliance on polycross methods and recurrent phenotypic selection. There has, however, been an increase in our understanding of lucerne biology, in particular the genetic relationships between members of the M. sativa complex as deduced by DNA analysis. The differences in breeding behaviour and vigour of diploids versus autotetraploids, and their underlying genetic causes, are also discussed in relation to lucerne improvement. Medicago falcata, a member of the M. sativa complex, has contributed substantially to lucerne improvement in North America, and its diverse genetics appear to have been under-utilised in Australian programs over the last two decades, despite the reduced need for tolerance to freezing injury in Australian environments. Breeding of lucerne in Australia only commenced on a large scale in 1977, driven by an urgent need to introgress aphid resistance into adapted backgrounds. The release in the early 1980s of lucernes with multiple pest and disease resistance (aphids, Phytophthora, Colletotrichum) significantly increased lucerne productivity and persistence in eastern Australia, with yield increases of up to 300% under high disease pressure being recorded over Hunter River, the cultivar that predominated in Australia until 1977. Since that period, irrigated lucerne yields have plateaued, highlighting the need to identify the breeding objectives, technologies, and germplasm that will create new opportunities for increasing performance. This review discusses major goals for lucerne improvement programs in Australia and indicates the germplasm sources and technologies that are likely to deliver the desired outcomes.
Abstract:
Arguably the most complex cortical functions are seated in human cognition, the how and why of which have been debated for centuries by theologians, philosophers and scientists alike. In his best-selling book, The Astonishing Hypothesis: The Scientific Search for the Soul, Francis Crick refined the view that these qualities are determined solely by cortical cells and circuitry. Put simply, cognition is nothing more, or less, than a biological function. Accepting this to be the case, it should be possible to identify the mechanisms that subserve cognitive processing. Since the pioneering studies of Lorente de Nó and Hebb, and the more recent studies of Fuster, Miller and Goldman-Rakic, to mention but a few, much attention has been focused on the role of persistent neural activity in cognitive processes. The application of modern technologies and modelling techniques has led to new hypotheses about the mechanisms of persistent activity. Here I focus on how regional variations in the pyramidal cell phenotype may determine the complexity of cortical circuitry and, in turn, influence neural activity. Data obtained from thousands of individually injected pyramidal cells in sensory, motor, association and executive cortex reveal marked differences in the numbers of putative excitatory inputs received by these cells. Pyramidal cells in the prefrontal cortex have, on average, up to 23 times more dendritic spines than those in the primary visual area. I propose that without these specializations in the structure of pyramidal cells, and the circuits they form, human cognitive processing would not have evolved to its present state. I also present data from both New World and Old World monkeys showing varying degrees of complexity in the pyramidal cell phenotype in their prefrontal cortices, suggesting that cortical circuitry and, thus, cognitive styles are evolving independently in different species.
Abstract:
Today, information overload and the lack of systems for locating employees with the right knowledge or skills are common challenges that large organisations face. They force knowledge workers to re-invent the wheel and make it difficult to retrieve information from both internal and external resources. In addition, information is changing dynamically, and ownership of data is moving from corporations to individuals. However, there is a set of web-based tools that may bring major progress in the way people collaborate and share their knowledge. This article aims to analyse the impact of Web 2.0 on organisational knowledge strategies. A comprehensive literature review was conducted to present the academic background, followed by a review of current Web 2.0 technologies and an assessment of their strengths and weaknesses. As the framework of this study is oriented towards business applications, the characteristics of the relevant segments and tools were reviewed from an organisational point of view. Moreover, the Enterprise 2.0 paradigm does not only imply tools; it also changes the way people collaborate and the way work is done (processes), and it impacts other technologies. Finally, gaps in the literature in this area are outlined.
Abstract:
In this review paper, different designs based on stacked p-i'-n-p-i-n heterojunctions are presented and compared with single p-i-n sensing structures. The imagers utilise self-field-induced depletion layers for light detection and a modulated laser beam for sequential readout. The effects of the sensing element structure, cell configuration (single or tandem), and light source properties (intensity and wavelength) are correlated with the sensor output characteristics (light-to-dark sensitivity, spatial resolution, linearity and S/N ratio). The readout frequency is optimized, showing that scan speeds of up to 10⁴ lines per second can be achieved without degradation in resolution. Multilayered p-i'-n-p-i-n heterostructures can also be used as wavelength-division multiplexing/demultiplexing devices in the visible range. Here the sensor element faces the modulated light from different input colour channels, each one with a specific wavelength and bit rate. By reading out the photocurrent at the appropriate applied bias, the information is multiplexed or demultiplexed and can be transmitted or recovered. Electrical models are presented to support the sensing methodologies.
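A small worked example may clarify the bias-controlled multiplexing/demultiplexing scheme described above: if the photocurrent measured at each applied bias is (approximately) a different linear combination of the input colour channels, then reading at as many biases as there are channels lets the channel signals be recovered by solving a linear system. The responsivity values in this Python sketch are invented for illustration and stand in for the device's measured bias-dependent spectral response.

```python
# Illustrative sketch of bias-controlled WDM readout: the photocurrent
# under each applied bias is a different weighted sum of the input
# colour channels, so reading at as many biases as channels lets us
# demultiplex by solving a linear system. Responsivities are made up.
import numpy as np

# Hypothetical responsivities (rows: applied bias, cols: red/blue channel).
R = np.array([[0.9, 0.2],    # forward bias: red-dominated response
              [0.3, 1.1]])   # reverse bias: blue-enhanced response

def multiplex(channel_signals: np.ndarray) -> np.ndarray:
    """Photocurrents measured at each bias for the given channel inputs."""
    return R @ channel_signals

def demultiplex(photocurrents: np.ndarray) -> np.ndarray:
    """Recover the individual channel signals from the measured currents."""
    return np.linalg.solve(R, photocurrents)

# Two channels carrying bits 1 and 0:
sent = np.array([1.0, 0.0])
measured = multiplex(sent)
print(np.round(demultiplex(measured)))  # ≈ [1. 0.]
```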
Abstract:
An increasing amount of research is being conducted in the area where technology and humans meet. The success or failure of technologies, and the question of whether technology helps humans fulfil their goals or hinders them, is in most cases not a technical one. User Perception and Influencing Factors of Technology in Everyday Life addresses issues of human and technology interaction. The research in this work is interdisciplinary, ranging from technical subjects such as computer science, engineering, and information systems to non-technical descriptions of technology and human interaction from the point of view of sociology or philosophy. This book suits academics, researchers, and professionals alike, as it presents a set of theories that allow us to understand the interaction of technology and humans and to put it to practical use.