987 results for Internet services


Relevance:

60.00%

Publisher:

Abstract:

The final-year project (Proyecto Final de Carrera, PFC) "Implementación de Ingeniería Virtual con Joomla!" aims to create a web platform on which a multidisciplinary engineering project can be carried out, based on networking, working groups and flexible work. Networking here means performing the work through information and communication technologies (ICT). The working groups are made up of multidisciplinary, multiracial and multicultural people of different religions, located in different time zones, for whom collaboration, flexibility and resource sharing are the order of the day. Flexibility is the workers' capacity to adapt to the demands of productivity: their managers place their trust in them and receive the finished work in the agreed form and by the agreed date. These workers need neither constant supervision nor a fixed place to do their job; everything they need, the information as well as the tools, is on the network, which turns them into teleworkers.

Such workers make intensive use of their knowledge and cannot afford to let it become obsolete, which would be a great misfortune for them, so they must train continuously, learning the new technologies that appear in order to open new lines of business and generate new revenue. Workers who make intensive use of ICT are characterised by continuous innovation and technological change, and they need a broad professional and social network with strong links: such networks keep them up to date with the innovations taking place in companies and give them access to new jobs, courses on new technologies and so on. Today's Internet services make it possible to keep a much larger number of contacts alive than in earlier times. The platform proposed in this final-year project brings together all the tools these workers need to carry out their activity and to maintain their professional networks.

Relevance:

60.00%

Publisher:

Abstract:

This paper describes a mobile-based system for interacting with objects in smart spaces, where the offer of resources may be extensive. The underlying idea is to use the augmentation capabilities of the mobile device to enable it as a user-object mediator. In particular, the paper details how to build an attitude-based reasoning strategy that facilitates user-object interaction and resource filtering. The strategy prioritizes the available resources depending on the spatial history of the user, his real-time location and orientation and, finally, his active touch and focus interactions with the virtual overlay. The proposed reasoning method has been partially validated through a prototype that handles 2D and 3D visualization interfaces. This framework makes it possible to put the IoT paradigm into practice, augmenting objects without physically modifying them.
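
The sketch below illustrates, in broad strokes, what such an attitude-based prioritization could look like: each candidate resource gets a score that combines the user's spatial history (visit counts), real-time distance and heading, and the current focus in the virtual overlay. The weights, field names and scoring formula are assumptions made for clarity, not the reasoning strategy actually implemented in the paper.

```python
# Illustrative sketch (not the paper's implementation) of attitude-based
# ranking of smart-space resources. Weights and fields are assumptions.
from dataclasses import dataclass
from math import atan2, degrees, hypot

@dataclass
class Resource:
    name: str
    x: float          # position of the augmented object (metres)
    y: float
    visit_count: int  # spatial history: prior interactions with this object
    focused: bool     # currently touched/selected in the virtual overlay

def angular_offset(heading_deg: float, user_xy, res: Resource) -> float:
    """Absolute angle (degrees) between the user's heading and the object."""
    bearing = degrees(atan2(res.y - user_xy[1], res.x - user_xy[0]))
    return abs((bearing - heading_deg + 180) % 360 - 180)

def score(res: Resource, user_xy, heading_deg: float) -> float:
    distance = hypot(res.x - user_xy[0], res.y - user_xy[1])
    # Closer, in-view, previously visited and currently focused resources
    # rank higher; the weights below are arbitrary.
    return (2.0 * res.focused
            + min(res.visit_count, 10) / 10
            + 1.0 / (1.0 + distance)
            + (1.0 - angular_offset(heading_deg, user_xy, res) / 180))

resources = [
    Resource("lamp", 1.0, 2.0, visit_count=5, focused=False),
    Resource("thermostat", 4.0, -1.0, visit_count=0, focused=True),
]
ranked = sorted(resources, key=lambda r: score(r, (0.0, 0.0), 90.0), reverse=True)
print([r.name for r in ranked])
```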

Relevance:

60.00%

Publisher:

Abstract:

The presented work aims to contribute towards the standardization and interoperability of the Future Internet through an open and scalable architecture design. We present S³OiA, a syntactic/semantic Service-Oriented Architecture that allows the integration of any type of object or device, regardless of its nature, into the Internet of Things. Moreover, the architecture makes it possible to use the underlying heterogeneous resources as a substrate for the automatic composition of complex applications through a semantic Triple Space paradigm. The applications created are dynamic and adaptive, since they are able to evolve depending on the context in which they are executed. The validation scenario of this architecture encompasses areas that are prone to involve human beings in order to promote personal autonomy, such as home-care automation environments and Ambient Assisted Living.
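
A minimal sketch of the Triple Space idea is given below, assuming a simple in-memory store: devices publish their capabilities as subject-predicate-object triples, and an application composes itself by matching patterns against that shared space. The vocabulary, class names and matching rules are illustrative, not the ontology or middleware used by S³OiA.

```python
# Minimal sketch of the Triple Space idea behind S3OiA: heterogeneous devices
# publish their capabilities as (subject, predicate, object) triples and an
# application is composed by matching patterns against the shared space.
# The vocabulary below is an illustrative assumption.
from typing import Optional

Triple = tuple[str, str, str]

class TripleSpace:
    def __init__(self) -> None:
        self._triples: set[Triple] = set()

    def write(self, triple: Triple) -> None:
        self._triples.add(triple)

    def read(self, s: Optional[str] = None, p: Optional[str] = None,
             o: Optional[str] = None) -> list[Triple]:
        """Return all triples matching the pattern; None acts as a wildcard."""
        return [t for t in self._triples
                if (s is None or t[0] == s)
                and (p is None or t[1] == p)
                and (o is None or t[2] == o)]

space = TripleSpace()
# Devices of different natures register through the same abstraction.
space.write(("urn:dev:thermometer1", "providesMeasurement", "Temperature"))
space.write(("urn:dev:hvac1", "acceptsCommand", "SetTemperature"))

# A home-care application composes itself by discovering a sensor/actuator pair.
sensors = space.read(p="providesMeasurement", o="Temperature")
actuators = space.read(p="acceptsCommand", o="SetTemperature")
if sensors and actuators:
    print("compose:", sensors[0][0], "->", actuators[0][0])
```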

Relevance:

60.00%

Publisher:

Abstract:

With the increasing use of Internet services and cloud computing by most organizations, both private and public, computing is becoming the fifth utility, alongside gas, water, electricity and telephony. These technologies are used by an ever-growing range of systems, including Critical Infrastructure (CI) systems, and even where CI systems appear not to rely directly on cloud services there may be hidden interdependencies; this is true even for private clouds, which seem more secure and reliable. Critical systems may begin with a clear and simple design, but they evolve, as described by Egan, into "rafted" networks. Because they are usually controlled by one or a few organizations, their dependencies can be understood even when the systems are complex, and the organization oversees and manages changes. These CI systems have already been reshaped by earlier ICT models such as global communications, PCs and the Internet. Virtualization took longer to be adopted by critical systems because of their strategic nature, but once these technologies had been proven in other areas they were eventually adopted as well, for reasons such as cost. Cloud computing is the latest such model, built on earlier technologies (virtualization, distributed and utility computing, web and software services) that are now offered in new ways.

As organizations migrate more services to the cloud, the internal complexity of their systems and the reliability of the services they offer to themselves and to their clients are affected, and this added complexity and the associated reliability risks are not always recognized. Moreover, when two or more CI systems interact, the risks of one can affect the rest, so the risks are shared. This work investigates the use of cloud computing by critical systems, focusing on their dependencies and, especially, their reliability. Examples of such use are presented together with the associated risks; although adoption in critical systems is not yet widespread, its potential impact is outlined. The objective is first to define a framework and a model that can represent, in quantitative terms, the reliability interdependencies of organizations that rely on cloud services, and how to improve their dependability and resilience, and then to apply the model to a critical system in the healthcare field and present the results. As part of the framework, the concepts of micro-dependability and macro-dependability are introduced to capture, respectively, the internal dependability of the system and its external dependability on services supplied by an external cloud. A pharmacovigilance model system has been used to validate the framework.
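
As a back-of-the-envelope illustration of the micro/macro split, the sketch below composes an overall availability figure from internal components (micro) and externally supplied cloud services (macro) using a simple series product. The formula and the numbers are assumptions chosen for illustration, not the quantitative model defined in the work.

```python
# Illustrative composition of dependability for a system that relies on an
# external cloud: "micro" covers the organisation's own components, "macro"
# adds the cloud services it depends on. The series-availability product and
# the figures are assumptions, not the thesis's model.
from functools import reduce

def series_availability(availabilities: list[float]) -> float:
    """All components are required, so their availabilities multiply."""
    return reduce(lambda a, b: a * b, availabilities, 1.0)

# Internal (micro) components of a hypothetical pharmacovigilance system.
micro = series_availability([0.9995,   # application servers
                             0.9990])  # local database

# External (macro) dependencies supplied by cloud providers.
macro = series_availability([0.9990,   # IaaS platform
                             0.9995])  # connectivity to the provider

overall = micro * macro
for name, value in [("micro", micro), ("macro", macro), ("end-to-end", overall)]:
    downtime_h = (1 - value) * 365 * 24
    print(f"{name:10s} availability={value:.6f}  ~{downtime_h:.1f} h/year downtime")
```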

Relevance:

60.00%

Publisher:

Abstract:

An on-line survey of experts was conducted to solicit their views on policy priorities in the area of information and communication technologies (ICT) in the Caribbean. The experts considered the goal to "promote teacher training in the use of ICTs in the classroom" to be the highest priority, followed by goals to "reduce the cost of broadband services" and "promote the use of ICT in emergency and disaster prevention, preparedness and response." Goals in the areas of cybercrime, e-commerce, e-government, universal service funds, consumer protection, and on-line privacy rounded out the top 10. Some of the lowest ranked goals were those related to coordinating the management of infrastructure changes. These included the switchover to digital terrestrial television (DTT) and digital FM radio, cloud computing for government ICT, the introduction of satellite-based Internet services, and the installation of content distribution networks (CDNs). Initiatives aimed at using ICT to promote specific industries, or specific means of promoting the digital economy, tended toward the centre of the rankings. Thus, a general pattern emerged which elevated the importance of focusing on how ICT is integrated into the broader society, with economic issues a lower priority, and concerns about coordination on infrastructure issues lower still.

Relevance:

60.00%

Publisher:

Abstract:

The Internet has become a universal communication network tool. It has evolved from a platform that supports best-effort traffic to one that now carries different traffic types, including those involving continuous media with quality of service (QoS) requirements. As more services are delivered over the Internet, we face increasing risk to their availability, given that malicious attacks on those Internet services continue to increase. Several networks have witnessed denial of service (DoS) and distributed denial of service (DDoS) attacks over the past few years, which have disrupted the QoS of network services and thereby violated the Service Level Agreement (SLA) between the client and the Internet Service Provider (ISP). Hence DoS and DDoS attacks are major threats to network QoS. In this paper we survey techniques and solutions that have been deployed to thwart DoS and DDoS attacks, and we evaluate them in terms of their impact on network QoS for Internet services. We also present vulnerabilities in QoS protocols that, if exploited, affect QoS. In addition, we highlight challenges that still need to be addressed to achieve end-to-end QoS with recently proposed DoS/DDoS solutions. © 2010 John Wiley & Sons, Ltd.
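
As a generic illustration of one mitigation family that such surveys cover, the sketch below applies per-source token-bucket rate limiting so that flood traffic is dropped before it starves capacity reserved for SLA-bound flows. The parameters and code are illustrative assumptions, not a specific solution evaluated in the paper.

```python
# Generic per-source token-bucket rate limiter, illustrating one family of
# DoS mitigation. Rates and bucket sizes below are arbitrary assumptions.
import time
from collections import defaultdict

RATE = 10.0    # tokens (requests) refilled per second, per source
BURST = 20.0   # bucket capacity

_buckets: dict[str, list[float]] = defaultdict(lambda: [BURST, time.monotonic()])

def allow(src_ip: str) -> bool:
    """Return True if a request from src_ip may pass, False if it is dropped."""
    tokens, last = _buckets[src_ip]
    now = time.monotonic()
    tokens = min(BURST, tokens + (now - last) * RATE)
    if tokens >= 1.0:
        _buckets[src_ip] = [tokens - 1.0, now]
        return True
    _buckets[src_ip] = [tokens, now]
    return False

# A flooding source is throttled: only roughly the burst size gets through.
accepted = sum(allow("203.0.113.9") for _ in range(1000))
print(f"accepted {accepted} of 1000 back-to-back requests from one source")
```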

Relevance:

60.00%

Publisher:

Abstract:

A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span the period between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data came from multi-project archives acquired via open Internet services, or from individual projects acquired directly from the data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control, and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making the metadata available, it is also possible to analyse each set of data separately.
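
The sketch below, assuming pandas and invented column names, shows the kind of merging step described above: observations that fall close together in time and space are binned and averaged before entering the merged table, while the contributing source is kept as metadata. It is an illustration of the idea, not the OC-CCI processing chain itself.

```python
# Illustrative merging step: average observations that are close in time and
# space. Column names and the 1-hour / 0.01-degree bins are assumptions.
import pandas as pd

obs = pd.DataFrame({
    "source": ["MOBY", "MOBY", "AERONET-OC"],
    "time": pd.to_datetime(["2005-06-01 10:05", "2005-06-01 10:20",
                            "2005-06-01 10:10"]),
    "lat": [20.8201, 20.8203, 45.3140],
    "lon": [-157.1920, -157.1921, 12.5080],
    "chl_a": [0.071, 0.069, 1.32],          # chlorophyll-a, mg m-3
})

# Bin to a coarse space/time resolution and average within each bin,
# keeping the contributing source as metadata.
binned = obs.assign(
    lat_bin=obs["lat"].round(2),
    lon_bin=obs["lon"].round(2),
    time_bin=obs["time"].dt.floor("1h"),
)
merged = (binned
          .groupby(["source", "lat_bin", "lon_bin", "time_bin"], as_index=False)
          .agg(chl_a=("chl_a", "mean"), n_obs=("chl_a", "size")))
print(merged)
```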

Relevance:

60.00%

Publisher:

Abstract:

Intrusion Detection Systems (IDSs) provide an important layer of security for computer systems and networks, and are becoming more and more necessary as reliance on Internet services increases and systems with sensitive data are more commonly open to Internet access. An IDS's responsibility is to detect suspicious or unacceptable system and network activity and to alert a systems administrator to this activity. The majority of IDSs use a set of signatures that define what suspicious traffic is, and Snort is one popular and actively developed open-source IDS that uses such a set of signatures, known as Snort rules. Our aim is to identify a way in which Snort could be developed further by generalising rules to identify novel attacks. In particular, we attempted to relax and vary the conditions and parameters of current Snort rules, using a similar approach to classic rule learning operators such as generalisation and specialisation. We demonstrate the effectiveness of our approach through experiments with standard datasets and show that we are able to detect previously undetected variants of various attacks. We conclude by discussing the general effectiveness and appropriateness of generalisation in Snort-based IDS rule processing. Keywords: anomaly detection, intrusion detection, Snort, Snort rules
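
The toy example below, assuming a made-up rule and a single hypothetical operator, shows the flavour of such generalisation: one condition of a Snort-like rule (an exact destination port) is relaxed into a range so that near-miss variants of the original attack can still trigger the rule. It is not one of the operators or rules used in the study.

```python
# Toy rule-generalisation operator: widen an exact destination port in a
# Snort-like rule header into a +/- slack range. The rule text is invented.
def generalise_dst_port(rule: str, slack: int = 10) -> str:
    """Replace an exact destination port with a port range."""
    header, _, options = rule.partition("(")
    fields = header.split()   # action proto src_ip src_port -> dst_ip dst_port
    if len(fields) >= 7 and fields[6].isdigit():
        port = int(fields[6])
        fields[6] = f"{max(0, port - slack)}:{port + slack}"
    return " ".join(fields) + " (" + options

rule = ('alert tcp $EXTERNAL_NET any -> $HOME_NET 8080 '
        '(msg:"example backdoor"; content:"CONNECT"; sid:1000001;)')
print(generalise_dst_port(rule))
# -> ... $HOME_NET 8070:8090 (msg:"example backdoor"; ...)
```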

Relevance:

60.00%

Publisher:

Abstract:

A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span the period between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data came from multi-project archives acquired via open Internet services, or from individual projects acquired directly from the data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control, and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making the metadata available, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).
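
The merged table is intended for matchup-based validation; the sketch below, with invented numbers and column choices, shows the sort of statistics such an exercise might compute by comparing satellite retrievals against the compiled in situ values. It is illustrative only and does not reproduce the OC-CCI validation protocol.

```python
# Illustrative matchup statistics between satellite and in situ chlorophyll-a.
# Values and the log10 bias/RMSE choice are assumptions for illustration.
import numpy as np

insitu = np.array([0.07, 0.12, 0.95, 1.30, 2.10])      # mg m-3
satellite = np.array([0.09, 0.10, 1.10, 1.05, 2.40])   # mg m-3

log_diff = np.log10(satellite) - np.log10(insitu)
bias = log_diff.mean()                   # systematic offset in log10 space
rmse = np.sqrt((log_diff ** 2).mean())   # scatter in log10 space
print(f"n={insitu.size}  log10 bias={bias:+.3f}  log10 RMSE={rmse:.3f}")
```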

Relevance:

40.00%

Publisher:

Abstract:

Internet and Web services are used in both teaching and learning and are gaining popularity in today's world. E-learning is becoming popular and is considered the latest advance in technology-based learning. Despite its potential advantages for learning in a small country like Bhutan, there is a lack of e-services at the Paro College of Education. This study investigated students' attitudes towards online communities and their frequency of access to the Internet, and how students locate and use different sources of information in their project tasks. Since improvement was at the heart of this research, an action research approach was used. Based on the idea of purposeful sampling, semi-structured interviews and observations were used as data collection instruments. Ten randomly selected students (5 girls and 5 boys) participated in this research as the study group. The findings indicated that there is a lack of educational information technology services, such as e-learning, at the college. The very slow Internet connection was the main barrier to learning through e-learning or accessing Internet resources. There is a strong relationship between the quality of the written task and the source of the information, and between Web searching and learning. The sources of information used in assignments and project work are limited to books in the library, which are often outdated and of poor quality. Project tasks submitted by most of the students were of poor quality.

Relevance:

40.00%

Publisher:

Abstract:

Service-oriented Architectures (SOA) and Web services leverage the technical value of solutions in the areas of distributed systems and cross-enterprise integration. The emergence of Internet marketplaces for business services is driving the need to describe services not only at a technical level, but also from a business and operational perspective. While SOA and Web services reside in an IT layer, organizations that own Internet marketplaces need to advertise and trade business services, which reside in a business layer. As a result, the gap between business and IT needs to be closed. This paper presents USDL (Unified Service Description Language), a specification language for describing services from a business, operational and technical perspective. USDL plays a major role in the Internet of Services, where it describes tradable services that are advertised in electronic marketplaces. The language has been tested using two service marketplaces as use cases.
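
As a rough illustration of the three perspectives, the sketch below encodes a hypothetical service description as a plain data structure. It deliberately does not use actual USDL syntax; all field names and values are invented.

```python
# Hypothetical service description grouped into business, operational and
# technical perspectives. Not USDL syntax; fields and values are invented.
service_description = {
    "business": {
        "provider": "ExampleLogistics GmbH",
        "pricing": {"model": "per-shipment", "price_eur": 4.90},
        "legal": {"sla_uptime": "99.5%", "jurisdiction": "DE"},
    },
    "operational": {
        "availability_hours": "Mon-Fri 06:00-22:00 CET",
        "support_channel": "email",
    },
    "technical": {
        "interface": "REST",
        "endpoint": "https://api.example.com/shipments",  # placeholder URL
        "input": {"address": "string", "weight_kg": "number"},
        "output": {"tracking_id": "string"},
    },
}

# A marketplace could index descriptions like this and filter on business
# attributes (price, SLA) as well as technical ones (interface type).
print(service_description["business"]["pricing"])
```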

Relevance:

40.00%

Publisher:

Abstract:

A prominent research focus, especially in the context of EU public funding, has been the systematic use of the Internet for new ways of value creation in the services sector. This idea of service networks on the Internet, frequently dubbed the Internet of Services or Web service ecosystems, aims to make services tradable in digital media. In order to enable communication and trade between providers and consumers of services, the Internet of Services requires a standard that creates a "commercial envelope" around a service. This is where the Unified Service Description Language (USDL) comes into play as a normative and balanced unification of service information. The unified description established by USDL is machine-processable and covers the technical and business aspects of a service as well as its functional and non-functional attributes.