793 results for information security management system


Relevance:

100.00%

Publisher:

Abstract:

E-NATURAL is a web portal where companies in the rural tourism sector can be found; through this portal they advertise and sell their products and services. Each company has its own individual web space for promoting itself on the internet. A search engine allows users to access the contents of every company registered in the system. This search engine is open: any unregistered user can browse the information about the products and services on offer and access all the information related to each company. Registered companies have a complete, easy-to-use and intuitive information system that lets each of them self-manage all the content of its own products and web pages. The portal also includes a content management system that automatically generates professional web pages, with the possibility of editing them. In addition, registered users can book products through a complete reservation management system, with special attention to accommodation; purchase products through a complete shopping system integrated with the PayPal platform; and rate products and web pages of the system through ranking-based voting. The platform also contains a comment management system for products and company web pages that allows content to be shown or hidden. Finally, users can share information about content published on the pages through social networks such as Twitter, Google+ and Facebook.

Relevance:

100.00%

Publisher:

Abstract:

Environmental Management includes many components, among which we can include Environmental Management Systems (EMS), Environmental Reporting and Analysis, Environmental Information Systems and Environmental Communication. In this work two applications are presented: the development and implementation of an Environmental Management System in local administrations, according to the European scheme "EMAS", and the analysis of a territorial energy system through scenario building and environmental sustainability assessment. Both applications are linked by the same objective, which is the quest for more scientifically sound elements; in fact, both EMS and energy planning are often characterized by localism and poor comparability. Emergy synthesis, proposed by ecologist H.T. Odum and described in his book "Environmental Accounting: Emergy and Environmental Decision Making" (1996), has been chosen and applied as an environmental evaluation tool, in order to complete the analysis with an assessment of the "global value" of goods and processes. In particular, emergy synthesis has been applied in order to improve the evaluation of the significance of environmental aspects in an EMS, and in order to evaluate the environmental performance of three scenarios of future evolution of the energy system. Regarding EMS, this work discusses an application of an EMS together with the CLEAR methodology for environmental accounting, in order to improve the identification of environmental aspects; data regarding the environmental aspects, and the significant ones, for 4 local authorities are also presented, together with a preliminary proposal for integrating the assessment of the significance of environmental aspects with emergy synthesis.
Regarding the analysis of an energy system, this work presents the characterization of the current situation together with the overall energy balance and the evaluation of greenhouse gas emissions; moreover, three scenarios of future evolution are described and discussed. The scenarios have been realized with the support of the LEAP software ("Long-range Energy Alternatives Planning system" by SEI - "Stockholm Environment Institute"). Finally, the emergy synthesis of the current situation and of the three scenarios is shown.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis is the study of techniques for efficient management and use of the spectrum based on cognitive radio technology. The ability of cognitive radio technologies to adapt to the real-time conditions of their operating environment offers the potential for more flexible use of the available spectrum. In this context, international interest is particularly focused on the "white spaces" in the UHF band of digital terrestrial television. Spectrum sensing and geo-location databases have been considered in order to obtain information on the electromagnetic environment. Different methodologies have been considered in order to investigate the spectral resources potentially available to white space devices in the TV band. The adopted methodologies are based on the geo-location database approach, used either autonomously or in combination with sensing techniques. A novel and computationally efficient methodology for the calculation of the maximum permitted white space device EIRP is then proposed; the methodology is suitable for implementation in TV white space databases. Different Italian scenarios are analyzed in order to identify both the available spectrum and the white space device emission limits. Finally, two different applications of cognitive radio technology are considered. The first is emergency management, where the attention is focused on both cognitive and autonomic networking approaches when deploying an emergency management system. Cognitive technology is then considered in applications related to satellite systems: in particular, a hybrid cognitive satellite-terrestrial system is introduced, and an analysis of the coexistence between terrestrial and satellite networks under a cognitive approach is performed.

Relevance:

100.00%

Publisher:

Abstract:

This thesis proposes a model, an architecture and a technology for the naming system of the TuCSoN coordination middleware, covering agents, nodes and resources: universal identities that represent these entities, for both physical and virtual mobility, within a distributed Management System (AMS, NMS, RMS); this module also deals with ACCs and transducers, addressing issues such as fault tolerance, persistence and consistency, together with disembodied coordination over the network, as happens with Cloud technologies. The thesis opens with an introduction that describes its contents, so as to give an initial overall view of the work carried out. Chapter 1 covers the background knowledge needed to understand the thesis, namely TuCSoN (the coordination middleware the designed module must interface with) and Cassandra (the distributed server system on which the module's data persistence and storage relies). Chapter 2 describes JADE, the middleware from which the study for the design of the module's model and architecture started. Chapter 3 explains the structure and model of the module, examining all the details of its internal entities and of the relationships among them; this part also details the distribution of the module and its components over the network. Chapter 4 details the module's naming system: the syntax and the set of procedures that an external consumer entity must carry out to obtain a "universal name", as well as all the internal steps the module performs to provide the identifier to the consumer entity.
Chapter 5 describes the case studies concerning interactions with external entities, the internal entities depending on whether or not the module is distributed over the network, and the policies, paradigms and procedures for fault and error tolerance, detailing the corresponding repair methods. Chapter 6 describes possible future developments, concerning new forms of interaction among the entities that use this module and possible improvements and technological evolutions of the module itself. Finally, the conclusions about the designed module are presented in detail, so as to give an overall view of what is contained and described in the thesis.

Relevance:

100.00%

Publisher:

Abstract:

This thesis concerns artificially intelligent natural language processing systems that are capable of learning the properties of lexical items (properties like verbal valency or inflectional class membership) autonomously while fulfilling the tasks for which they were deployed in the first place. Many of these tasks require a deep analysis of language input, which can be characterized as a mapping of utterances in a given input C to a set S of linguistically motivated structures with the help of linguistic information encoded in a grammar G and a lexicon L: G + L + C → S (1). The idea that underlies intelligent lexical acquisition systems is to modify this schematic formula in such a way that the system is able to exploit the information encoded in S to create a new, improved version of the lexicon: G + L + S → L' (2). Moreover, the thesis claims that a system can only be considered intelligent if it does not just make maximum usage of the learning opportunities in C, but is also able to revise falsely acquired lexical knowledge. So, one of the central elements in this work is the formulation of a set of criteria for intelligent lexical acquisition systems, subsumed under one paradigm: the Learn-Alpha design rule. The thesis describes the design and quality of a prototype for such a system, whose acquisition components have been developed from scratch and built on top of one of the state-of-the-art Head-driven Phrase Structure Grammar (HPSG) processing systems. The quality of this prototype is investigated in a series of experiments in which the system is fed with extracts of a large English corpus. While the idea of using machine-readable language input to automatically acquire lexical knowledge is not new, we are not aware of a system that fulfills Learn-Alpha and is able to deal with large corpora.
To instance four major challenges of constructing such a system: a) the high number of possible structural descriptions caused by highly underspecified lexical entries demands a parser with a very effective ambiguity management system; b) the automatic construction of concise lexical entries out of a bulk of observed lexical facts requires a special technique of data alignment; c) the reliability of these entries depends on the system's decision on whether it has seen 'enough' input; and d) general properties of language might render some lexical features indeterminable if the system tries to acquire them with too high a precision. The cornerstone of this dissertation is the motivation and development of a general theory of automatic lexical acquisition that is applicable to every language and independent of any particular theory of grammar or lexicon. This work is divided into five chapters. The introductory chapter first contrasts three different and mutually incompatible approaches to (artificial) lexical acquisition: cue-based queries, head-lexicalized probabilistic context-free grammars and learning by unification. Then the postulation of the Learn-Alpha design rule is presented. The second chapter outlines the theory that underlies Learn-Alpha and exposes all the related notions and concepts required for a proper understanding of artificial lexical acquisition. Chapter 3 develops the prototyped acquisition method, called ANALYZE-LEARN-REDUCE, a framework which implements Learn-Alpha. The fourth chapter presents the design and results of a bootstrapping experiment conducted on this prototype: lexeme detection, learning of verbal valency, categorization into nominal count/mass classes, and selection of prepositions and sentential complements, among others. The thesis concludes with a review of the findings, motivation for further improvements, and proposals for future research on the automatic induction of lexical features.
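The lexicon-update schema G + L + S → L' can be illustrated with a toy sketch. This is not the ANALYZE-LEARN-REDUCE implementation; the evidence threshold, the frame labels and the `observe`/`entry` helpers are all assumptions invented for the example. The point it shows is the revision requirement: because the most frequent frame wins, a falsely acquired entry is corrected automatically once counter-evidence outweighs it.

```python
from collections import Counter, defaultdict

EVIDENCE_THRESHOLD = 3  # assumed: an entry needs 'enough' input before being trusted

class ToyLexicon:
    """Toy illustration of G + L + S -> L': valency frames are acquired
    from analyzed structures S, and revised when later evidence
    contradicts earlier, falsely acquired knowledge."""

    def __init__(self):
        # verb -> Counter of observed subcategorization frames
        self.evidence = defaultdict(Counter)

    def observe(self, verb, frame):
        """Record one observed frame for a verb (one piece of S)."""
        self.evidence[verb][frame] += 1

    def entry(self, verb):
        """Return the acquired frame, or None while evidence is insufficient."""
        if not self.evidence[verb]:
            return None
        frame, count = self.evidence[verb].most_common(1)[0]
        return frame if count >= EVIDENCE_THRESHOLD else None

lex = ToyLexicon()
# one noisy intransitive observation, then four transitive ones
for f in ["NP_V", "NP_V_NP", "NP_V_NP", "NP_V_NP", "NP_V_NP"]:
    lex.observe("devour", f)
print(lex.entry("devour"))  # prints "NP_V_NP": the transitive frame dominates
```

A real system would of course acquire frames from parser output rather than from a hand-written list, and would weigh evidence against structural ambiguity rather than using a flat count.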

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is to develop a prototype of an e-learning environment that can foster Content and Language Integrated Learning (CLIL) for students enrolled in an aircraft maintenance training program, which allows them to obtain a license valid in all EU member states. Background research is conducted to retrace the evolution of the field of educational technology, analyzing different learning theories – behaviorism, cognitivism, and (socio-)constructivism – and reflecting on how technology and its use in educational contexts have changed over time. Particular attention is given to technologies that have been used and proved effective in Computer Assisted Language Learning (CALL). Based on the background research and on the students' learning objectives, i.e. learning highly specialized contents and aeronautical technical English, a bilingual approach is chosen, three main tools are identified – a hypertextbook, an exercise creation activity, and a discussion forum – and the learning management system Moodle is chosen as the delivery medium. The hypertextbook is based on the technical textbook, written in English, that students already use. In order to foster text comprehension, the hypertextbook is enriched with hyperlinks and tooltips. Hyperlinks redirect students to webpages containing additional information both in English and in Italian, while tooltips show Italian equivalents of English technical terms. The exercise creation activity and the discussion forum foster interaction and collaboration among students, according to socio-constructivist principles. In the exercise creation activity, students collaboratively create a workbook, which allows them to deeply analyze and master the contents of the hypertextbook and at the same time create a learning tool that can help them, as well as future students, to enhance learning.
In the discussion forum students can discuss their individual issues, whether content-related, English-related or e-learning-environment-related, helping one another and offering instructors suggestions on how to improve both the hypertextbook and the workbook based on their needs.

Relevance:

100.00%

Publisher:

Abstract:

PhEDEx, the CMS transfer management system, moved about 150 PB during the first LHC Run and is currently moving about 2.5 PB of data per week over the Worldwide LHC Computing Grid (WLCG). It was designed to complete each transfer required by users, at the expense of the waiting time necessary for its completion. For this reason, after several years of operations, data regarding transfer latencies have been collected and stored in log files containing useful, analyzable information. Then, starting from the analysis of several typical CMS transfer workflows, a categorization of such latencies has been made, with a focus on the different factors that contribute to the transfer completion time. The analysis presented in this thesis will provide the information necessary to equip PhEDEx in the future with a set of new tools to proactively identify and fix latency issues, minimizing their impact on end users.
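The idea of categorizing a transfer's completion time by its dominant contributing factor can be sketched as follows. This is a minimal illustration, not the actual PhEDEx log format or latency taxonomy: the record fields (`request`, `start`, `end`, `attempts`), the category names and the decision rule are all assumptions.

```python
from datetime import datetime

def categorize(record):
    """Attribute a transfer's completion time to the factor that
    dominated it. All field names and categories are assumed for
    illustration, not taken from PhEDEx itself."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    request = datetime.strptime(record["request"], fmt)
    start = datetime.strptime(record["start"], fmt)
    end = datetime.strptime(record["end"], fmt)
    queue_s = (start - request).total_seconds()  # waiting before the first attempt
    move_s = (end - start).total_seconds()       # time spent actually transferring
    if record["attempts"] > 1:
        return "retries"    # repeated failures dominated the completion time
    if queue_s > move_s:
        return "queueing"   # the transfer mostly waited for resources
    return "network"        # the wire time itself dominated

sample = {"request": "2012-05-01T10:00:00",
          "start": "2012-05-01T16:00:00",
          "end": "2012-05-01T16:30:00",
          "attempts": 1}
print(categorize(sample))  # prints "queueing": six hours waiting vs. thirty minutes moving
```

Aggregating such labels over a large set of log records is what turns raw latency data into the per-factor breakdown the abstract describes.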

Relevance:

100.00%

Publisher:

Abstract:

Our reality is characterized by constant progress and, to keep up, people need to stay up to date on events. In a world with so much news available, searching for the ideal items may be difficult, because the obstacles that make it arduous keep growing over time as data accumulate. In response, great help is given by Information Retrieval (IR), an interdisciplinary branch of computer science that deals with the management and retrieval of information. An IR system is developed to search for contents, contained in a reference dataset, that are considered relevant with respect to the need expressed by a query. To satisfy these ambitions, we must consider that most of the IR systems developed so far rely solely on textual similarity to identify relevant information, defining documents as relevant when they include one or more keywords expressed by the query. The idea studied here is that this is not always sufficient, especially when it is necessary to manage large databases, as is the web; the existing solutions may generate low-quality responses that do not allow users to navigate through them effectively. The intuition for overcoming these limitations has been to define a new concept of relevance, so as to rank the results differently. This gave rise to Temporal PageRank, a new proposal for Web Information Retrieval that relies on a combination of several factors to increase the quality of research on the web. Temporal PageRank incorporates the advantages of a ranking algorithm, to prefer the information reported by web pages considered important by the very context in which they reside, and the potential of techniques belonging to the world of Temporal Information Retrieval, which exploit the temporal aspects of data by describing their chronological contexts. In this thesis, the new proposal is discussed, comparing its results with those achieved by the best-known solutions and analyzing its strengths and weaknesses.
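The general idea of combining link-based authority with temporal relevance can be illustrated with a toy sketch. This is not the Temporal PageRank formulation studied in the thesis, which is not specified here; the exponential freshness weight and the half-life parameter are invented for the example, merely to show how a temporal factor can re-rank authoritative but stale pages below fresher ones.

```python
import math

def pagerank(links, d=0.85, iters=50):
    """Plain power-iteration PageRank over a dict page -> list of outlinks."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            targets = outs if outs else pages  # dangling pages spread rank evenly
            share = rank[p] / len(targets)
            for q in targets:
                new[q] += d * share
        rank = new
    return rank

def temporal_pagerank_sketch(links, age_days, half_life=365.0):
    """Assumed combination for illustration only: multiply each PageRank
    score by an exponential freshness factor, then renormalize."""
    base = pagerank(links)
    weighted = {p: base[p] * math.exp(-math.log(2) * age_days[p] / half_life)
                for p in links}
    total = sum(weighted.values())
    return {p: s / total for p, s in weighted.items()}

links = {"a": ["b"], "b": ["a"], "c": ["a", "b"]}
ages = {"a": 1000.0, "b": 10.0, "c": 10.0}  # "a" is authoritative but stale
scores = temporal_pagerank_sketch(links, ages)
```

With these numbers the fresh page "b" overtakes the equally authoritative but stale page "a", which is exactly the kind of re-ranking a temporal relevance notion is meant to produce.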

Relevance:

100.00%

Publisher:

Abstract:

Several non-invasive and novel aids for the detection of (and in some cases monitoring of) caries lesions have been introduced in the field of 'caries diagnostics' over the last 15 years. This chapter focuses on those available to dentists at the time of writing; continuing research is bound to lead to further developments in the coming years. Laser fluorescence is based on measurements of back-scattered fluorescence of a 655-nm light source. It enhances occlusal and (potentially) approximal lesion detection and enables semi-quantitative caries monitoring. Systematic reviews have identified false-positive results as a limitation. Quantitative light-induced fluorescence is another sensitive method to quantitatively detect and measure mineral loss both in enamel and some dentine lesions; again, the trade-offs with lower specificity when compared with clinical visual detection must be considered. Subtraction radiography is based on the principle of digitally superimposing two radiographs with exactly the same projection geometry. This method is applicable for approximal surfaces and occlusal caries involving dentine but is not yet widely available. Electrical caries measurements gather either site-specific or surface-specific information about teeth and tooth structure. Fixed-frequency devices perform best for occlusal dentine caries, but the method has also shown promise for lesions in enamel and other tooth surfaces with multi-frequency approaches. All methods require further research and further validation in well-designed clinical trials. In the future, they could have useful applications in clinical practice as part of a personalized, comprehensive caries management system.

Relevance:

100.00%

Publisher:

Abstract:

The CampusSource Workshop took place from 10 to 12 October 2006 at the Westfälische Wilhelms-Universität (WWU) in Münster. The main topics of the event were the development of an engine for linking e-learning applications with systems of HIS GmbH, and the creation of teaching and learning content with reuse in mind. The second chapter collects presentations from the event in Adobe Flash format. Viewing the presentations requires Adobe Flash Player, version 6 or later.

Relevance:

100.00%

Publisher:

Abstract:

In this article the use of Learning Management Systems (LMS) at the School of Engineering, University of Borås, in the year 2004 and the academic year 2009-2010 is investigated. The tools in the LMS were classified into four groups (tools for distribution, tools for communication, tools for interaction and tools for course administration) and the pattern of use was analyzed. The preliminary interpretation of the results was discussed with a group of teachers from the School of Engineering with long experience of using LMS. High expectations about LMS as a tool to facilitate flexible education, student-centered methods and the creation of an effective learning environment are abundant in the literature. This study, however, shows that in most of the surveyed courses the available LMS is predominantly used to distribute documents to students. The authors argue that a more elaborate use of LMS, and a transformation of pedagogical practices towards social constructivist, learner-centered procedures, should be treated as an integrated process of professional development.

Relevance:

100.00%

Publisher:

Abstract:

The use of virtual learning environments in Higher Education (HE) has been growing in Portugal, driven by the Bologna Process. An example is the use of Learning Management Systems (LMS), which represents an opportunity to leverage technological advances in the educational process. The progress of information and communication technologies (ICT), coupled with the great development of the Internet, has brought significant challenges for educators, requiring a thorough knowledge of the implementation process. These field notes present the results of a survey among teachers of a private HE institution on its use of Moodle as a tool to support face-to-face teaching. An essentially exploratory research methodology, based on a questionnaire survey and supported by statistical treatment, made it possible to detect teachers' motivations, types of use and perceptions in relation to this kind of tool. The results showed that most teachers, by a narrow margin (58%), had not changed their pedagogical practice as a consequence of using Moodle. Among those that did, 67% had attended institutional internal training. Some of the results obtained suggest further investigation and provide guidelines for planning future internal training.

Relevance:

100.00%

Publisher:

Abstract:

The aim of the web-based course "Advertising Psychology – The Blog Seminar" was to offer a contemporary teaching design using typical Web 2.0 characteristics such as comments, discussions and social media integration, including Facebook and Twitter support, since nowadays this is a common part of students' everyday life. This weblog (blog)-based seminar for Advertising Psychology was set up in order to make the course accessible to students from different campuses in the Ruhr metropolitan area. The technical aspect of the open-source content management system Drupal 6.0 and the didactical course structure, based on Merrill's five first principles of instruction, are introduced. To date, this blog seminar has been conducted three times with a total of 84 participants, who were asked to rate the course according to the benefits of different didactical elements and with regard to Kirkpatrick's levels of evaluation model. This model covers a) reactions such as reported enjoyment, perceived usefulness and perceived difficulty, and b) effects on learning through the subjectively reported increase in knowledge and attitude towards the seminar. Overall, the blog seminar was evaluated very positively and can be considered as providing support for achieving the learning objectives. However, a successful blended learning approach should always be tailored to the learning contents and the environment.

Relevance:

100.00%

Publisher:

Abstract:

The management of warehouse inventories in companies must meet considerable requirements regarding data availability, security and consistency. Today this is ensured by centralized data storage in warehouse management systems. On the other hand, in many areas (e.g. material flow and transport control, production control) there is a development trend towards decentralized control strategies, which promise increased flexibility and reduced complexity. Within a project funded by the Deutsche Forschungsgemeinschaft (DFG), this paper presents and discusses concepts for the distributed design of warehouse management systems.