804 results for Cloud Computing, Software-as-a-Service (SaaS), SaaS Multi-Tenant, Windows Azure


Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a cloud-based software platform for sharing publicly available scientific datasets. The proposed platform leverages the potential of NoSQL databases and asynchronous I/O technologies, such as Node.js, in order to achieve high performance and flexibility. The solution serves two main groups of users: dataset providers, the researchers responsible for sharing and maintaining datasets, and dataset users, those who wish to access the public data. The former are given tools to easily publish and maintain large volumes of data, whereas the latter are given tools to preview the original data and to create subsets of it through filter and aggregation operations. The choice of NoSQL over a more traditional RDBMS emerged from an extended benchmark between a relational database (MySQL) and a NoSQL database (MongoDB) that is also presented in this thesis. The results confirm the theoretical expectation that NoSQL databases are more suitable for the kind of data the system's users will be handling, i.e., non-homogeneous data structures that can grow very quickly. It is envisioned that such a platform can lead the way to a new era of scientific data sharing, where researchers can easily share and access all kinds of datasets and, in more advanced scenarios, be presented with recommended datasets and with existing research results built on top of those recommendations.
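
To make the subset-creation idea concrete, here is a minimal sketch of filter and aggregation operations over a MongoDB collection using PyMongo; the database name, collection, field names, and filter values are hypothetical, not the thesis' actual schema.

```python
# Minimal sketch of filter + aggregation over a public dataset in MongoDB.
# Assumes a local mongod; the "climate" collection and its fields are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
collection = client["open_datasets"]["climate"]

# Filter: keep only records from 2010 onwards (a $match stage).
# Aggregate: average temperature per station (a $group stage).
pipeline = [
    {"$match": {"year": {"$gte": 2010}}},
    {"$group": {"_id": "$station_id", "avg_temp": {"$avg": "$temperature"}}},
]

for row in collection.aggregate(pipeline):
    print(row["_id"], row["avg_temp"])
```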

Relevance:

100.00%

Publisher:

Abstract:

Objectives: To discuss how current research in the area of smart homes and ambient assisted living will be influenced by the use of big data. Methods: A scoping review of literature published in scientific journals and conference proceedings was performed, focusing on smart homes, ambient assisted living and big data over the years 2011-2014. Results: The health and social care market has lagged behind other markets in the introduction of innovative IT solutions, and it faces a number of challenges as the use of big data increases. First, there is a need for a sustainable and trustworthy information chain through which the needed information can be transferred from all producers to all consumers in a structured way. Second, there is a need for big data strategies and policies to manage the new situation, in which information is handled and transferred independently of where the expertise is located. Finally, there is an opportunity to develop new and innovative business models for a market that supports cloud computing, social media, crowdsourcing, etc. Conclusions: The interdisciplinary area of big data, smart homes and ambient assisted living is no longer only of interest to IT developers; it is also of interest to decision makers, as customers make more informed choices among today's services. In the future it will be important to make information usable for managers and improve decision making, tailor smart home services based on big data, develop new business models, increase competition and identify policies to ensure privacy, security and liability.

Relevance:

100.00%

Publisher:

Abstract:

The 10th European Conference on Information Systems Management is being held at the University of Évora, Portugal, on 8-9 September 2016. The Conference Chair is Paulo Silva and the Programme Chairs are Prof. Rui Quaresma and Prof. António Guerreiro. ECISM provides an opportunity for individuals researching and working in the broad field of information systems management, including IT evaluation, to come together to exchange ideas and discuss current research in the field. This has developed into a particularly important forum for the present era, where the modern challenges of managing information and evaluating the effectiveness of related technologies are constantly evolving in the world of Big Data and Cloud Computing. We hope that this year's conference will provide you with plenty of opportunities to share your expertise with colleagues from around the world. The keynote speakers for the Conference are Carlos Zorrinho from the Portuguese Delegation and Isabel Ramos from the University of Minho, Portugal. ECISM 2016 received an initial submission of 84 abstracts. After the double-blind peer review process, 25 academic papers, 7 PhD research papers, 3 Masters research papers and 5 work-in-progress papers have been accepted for publication in these Conference Proceedings. These papers represent research from around the world, including Belgium, Brazil, China, Czech Republic, Kazakhstan, Malaysia, New Zealand, Norway, Oman, Poland, Portugal, South Africa, Sweden, The Netherlands, UK and Vietnam.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents the study and experimental tests for a viability analysis of using multiple wireless technologies in urban traffic light controllers in a Smart City environment. Communication drivers, different types of antennas, data acquisition methods and data processing for monitoring the network are presented. The sensor and actuator modules are connected in a local area network through two distinct low-power wireless networks, using the 868 MHz and 2.4 GHz frequency bands. All data communications using 868 MHz go through a Moteino. Various tests were made to assess the most advantageous features of each communication type. The experimental results show better range for the 868 MHz solution, whereas 2.4 GHz has the advantage of a self-regenerating mesh network. The different pros and cons of both communication methods are presented.
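
As a hedged sketch of the kind of data acquisition described (the paper's actual interface is not given in the abstract), the snippet below logs sensor packets relayed by a gateway node over a serial port; the port name, baud rate, and packet format are illustrative assumptions.

```python
# Hedged sketch: logging packets relayed by an 868 MHz gateway node (e.g. a
# Moteino attached over USB serial). Port, baud rate, and the "node_id,rssi,
# payload" packet format are assumptions, not the paper's actual protocol.
import serial  # pyserial

with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1.0) as port:
    while True:
        line = port.readline().decode(errors="replace").strip()
        if not line:
            continue  # read timed out; keep polling
        node_id, rssi, payload = line.split(",", 2)
        print(f"node={node_id} rssi={rssi} dBm payload={payload}")
```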

Relevance:

100.00%

Publisher:

Abstract:

In this school, we introduced the foundations of mathematical analysis needed to study differential equations (ordinary and partial). One aim was to prepare students and staff members for more concrete problems arising in mathematical modelling in engineering and biological processes. Theoretical and numerical lectures were given, with a presentation of free scientific computing software using Python. A website and a shared drive were created to facilitate exchanges between students, lecturers and organizers. The school was sponsored by CIMPA (Centre International de Mathématiques Pures et Appliquées), IMU (International Mathematical Union), ISP (International Science Programme), NUOL (National University of Laos), Erasmus Mundus, ERDF – Picardie Region, Beerlao and ETL3G.
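
To give a flavour of the numerical lectures (a sketch in their spirit, not the school's actual course material), the snippet below solves the logistic ordinary differential equation with SciPy, typical of what free Python scientific computing tools make easy.

```python
# Illustrative sketch in the spirit of the school's numerical lectures (not the
# actual course material): solving the logistic equation y' = r*y*(1 - y/K)
# with SciPy's initial value problem solver.
import numpy as np
from scipy.integrate import solve_ivp

r, K = 1.5, 10.0          # growth rate and carrying capacity

def logistic(t, y):
    return r * y * (1.0 - y / K)

sol = solve_ivp(logistic, t_span=(0.0, 10.0), y0=[0.5], dense_output=True)
t = np.linspace(0.0, 10.0, 5)
print(sol.sol(t)[0])      # population values at the sample times
```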

Relevance:

100.00%

Publisher:

Abstract:

The modern industrial environment is populated by a myriad of intelligent devices that collaborate to accomplish the numerous business processes in place at production sites. The close collaboration between humans and work machines poses new and interesting challenges that industry must overcome in order to implement the new digital policies demanded by the industrial transition. The Industry 5.0 movement is a companion revolution to the previous Industry 4.0, and it relies on three characteristics that any industrial sector should have and pursue: human centrality, resilience, and sustainability. The fifth industrial revolution cannot be accomplished without starting from the implementation of Industry 4.0-enabled platforms. The common feature in the development of this kind of platform is the need to integrate the Information and Operational layers. Our thesis work focuses on the implementation of a platform addressing all the digitization features foreseen by the fourth industrial revolution, making IT/OT convergence inside production plants an improvement rather than a risk. Furthermore, we added modular features to our platform that enable the Industry 5.0 vision. We supported human centrality using mobile crowdsensing techniques, and resilience and sustainability using pluggable cloud computing services combined with data coming from the crowd. We achieved important and encouraging results in all the domains in which we conducted our experiments. Our IT/OT convergence-enabled platform exhibits the performance needed to satisfy the strict requirements of production sites. The multi-layer capability of the framework enables the exploitation of data not strictly coming from work machines, allowing a tighter interaction between the company, its employees, and its customers.
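
As a hedged illustration of how a crowdsensed reading might flow into such a platform (the thesis does not specify this interface), here is a minimal MQTT publisher using paho-mqtt; the broker address, topic layout, and payload fields are assumptions.

```python
# Hedged sketch of a mobile-crowdsensing contribution reaching an IT/OT platform
# over MQTT. Broker address, topic layout, and JSON payload are illustrative
# assumptions, not the thesis' actual interface. Uses the paho-mqtt 1.x client
# constructor; 2.x additionally requires a CallbackAPIVersion argument.
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.local", 1883)

reading = {
    "worker_id": "w-042",          # hypothetical contributor
    "noise_db": 71.3,              # ambient noise sampled by a phone
    "timestamp": time.time(),
}
client.publish("plant/line1/crowd/noise", json.dumps(reading), qos=1)
client.disconnect()
```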

Relevance:

100.00%

Publisher:

Abstract:

Time series analysis of multispectral satellite data offers an innovative way to extract valuable information about our changing planet. This is now a real option for scientists thanks to data availability as well as innovative cloud-computing platforms, such as Google Earth Engine. The integration of different missions would mitigate known issues in multispectral time series construction, such as gaps due to clouds or other atmospheric effects. With this purpose, harmonization among Landsat-like missions is possible through statistical analysis. This research offers an overview of the different instruments from the Landsat and Sentinel missions (the TM, ETM+, OLI, OLI-2 and MSI sensors) and product levels (Collection-2 Level-1 and Surface Reflectance for Landsat, and Level-1C and Level-2A for Sentinel-2). Moreover, a cross-sensor comparison was performed to assess the interoperability of the sensors on board the Landsat and Sentinel-2 constellations, with a possible combined use for time series analysis in mind. Firstly, more than 20,000 pairs of images acquired almost simultaneously all over Europe were selected over a period of several years. The study performed a cross-comparison analysis on these data and provided an assessment of the calibration coefficients that can be used to minimize differences in the combined use. Four of the most popular vegetation indices were selected for the study: NDVI, EVI, SAVI and NDMI. As a result, it is possible to reconstruct a longer and denser harmonized time series since 1984, useful for vegetation monitoring purposes. Secondly, the spectral characteristics of the recent Landsat-9 mission were assessed for a combined use with Landsat-8 and Sentinel-2. A cross-sensor analysis of the common bands of more than 3,000 almost simultaneous acquisitions verified a high consistency between datasets. The most relevant discrepancy was observed in the blue and SWIR bands, often used in vegetation- and water-related studies. This analysis was supported by spectroradiometer ground measurements.
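
To make the harmonization idea concrete, here is a sketch computing NDVI from red/NIR reflectance and applying a linear cross-sensor adjustment; the slope and intercept values are placeholders, not the calibration coefficients derived in the study.

```python
# Sketch of cross-sensor harmonization for time series analysis. NDVI is the
# standard (NIR - red) / (NIR + red); the slope/intercept pair below is a
# placeholder, NOT the calibration coefficients derived in the study.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    return (nir - red) / (nir + red)

def harmonize(index: np.ndarray, slope: float, intercept: float) -> np.ndarray:
    """Map an index from one sensor onto another via a linear adjustment."""
    return slope * index + intercept

# Hypothetical surface-reflectance pixels from a Sentinel-2 MSI scene.
red = np.array([0.06, 0.08, 0.05])
nir = np.array([0.42, 0.35, 0.47])

s2_ndvi = ndvi(nir, red)
landsat_like = harmonize(s2_ndvi, slope=0.98, intercept=0.004)  # placeholders
print(s2_ndvi, landsat_like)
```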

Relevance:

100.00%

Publisher:

Abstract:

In recent years the need to process and store data of any kind has grown considerably; in addition, the obsolescence of the centralized model has contributed to the increasingly frequent adoption of the distributed model. The traffic crossing the nodes of these infrastructures has therefore inevitably increased and, with the advent of IoT, Big Data, Cloud Computing, Serverless Computing, etc., it has reached very high peaks. Suffice it to say that while data used to be kept on premises, today it is not unusual for data storage to be entirely entrusted to third parties. As the traffic crossing the nodes of an infrastructure grows, so does the need for that traffic to be filtered and managed by the nodes themselves. The goal of this thesis is to extend a Message-oriented Middleware, capable of guaranteeing different qualities of service for message delivery, so as to accelerate its routing phase towards the destination nodes. The extension consists of adding to the previously implemented Message-oriented Middleware the ability to intercept incoming packets (which, in the case of this middleware, may represent the propagation of events) and redirect them to a new node based on certain parameters. The Message-oriented Middleware studied in this thesis acts as the message broker of a pub/sub model, so the redirection must happen with very low latency and, to that end, without ever leaving the operating system's kernel space. For this reason eBPF was chosen, in particular the XDP module, which makes it possible to write programs that execute inside the kernel.
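
As a minimal sketch of the XDP mechanism the thesis relies on (not the middleware's actual redirection logic), the program below loads an eBPF function into the kernel with bcc and attaches it to a network interface; the interface name is an assumption, and the program must run as root.

```python
# Minimal sketch of loading an XDP program with bcc (not the thesis' actual
# redirection logic). The eBPF function runs in kernel space for every packet
# arriving on the interface; here it simply lets all traffic through.
# Requires root privileges and the bcc toolchain.
from bcc import BPF

prog = r"""
int xdp_pass_all(struct xdp_md *ctx) {
    // Real redirection logic would parse headers here and return
    // XDP_REDIRECT for matching packets; this sketch just passes them on.
    return XDP_PASS;
}
"""

device = "eth0"  # assumed interface name
b = BPF(text=prog)
fn = b.load_func("xdp_pass_all", BPF.XDP)
b.attach_xdp(device, fn, 0)
try:
    print("XDP program attached to", device, "- Ctrl-C to detach")
    b.trace_print()          # stream kernel trace output, if any
finally:
    b.remove_xdp(device, 0)  # detach on exit
```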

Relevance:

100.00%

Publisher:

Abstract:

This thesis studies Serverless architectures, structures that allow developers to easily compose distributed applications out of many services within a cloud, without having to manage the server. More specifically, it studies FaaS (Function-as-a-Service), an event-driven cloud computing model in which code is deployed in containers managed by the platform and then executed on demand. After an initial survey of the state of the art, attention moved to the search for vulnerabilities in the context of OpenFaaS, an open-source framework that allows developers to easily deploy functions and microservices. The deployment was done with faasd, a simplification of OpenFaaS that uses the same components and the same ecosystem as OpenFaaS but relies on containerd instead of Kubernetes. After an initial installation and log-in phase, the work focused on various penetration testing methodologies and on the search for security vulnerabilities associated with this paradigm. In computing, a penetration test is the operational process of analysing or assessing the security of a system or network by simulating the attack of a potential malicious user. In the final phase, several attack attempts against the OpenFaaS system were carried out with the help of some tools. First, the network and its traffic were analysed with tools such as NMAP and Wireshark, to better understand their structure and how faasd creates functions from containers. Finally, functions were created through OpenFaaS to test its security and reliability. In particular, the functions probe inside the containers in order to assess the possibility of performing code injection and to detect possible sensitive data in the filesystem of the Docker image as well as in environment variables.
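
For orientation, here is a minimal sketch of exercising a deployed function through the OpenFaaS gateway's HTTP interface with Python's requests; the gateway address, credentials, and the function name are assumptions, while the /function/<name> and /system/functions routes are the gateway's standard ones.

```python
# Hedged sketch: exercising a function through the OpenFaaS gateway's HTTP API.
# The gateway URL, admin password, and function name are assumptions.
import requests

GATEWAY = "http://127.0.0.1:8080"          # assumed faasd gateway address
AUTH = ("admin", "example-password")       # assumed basic-auth credentials

# List deployed functions (authenticated management endpoint).
for fn in requests.get(f"{GATEWAY}/system/functions", auth=AUTH).json():
    print(fn["name"], fn["image"])

# Invoke a function by POSTing a payload to its route.
resp = requests.post(f"{GATEWAY}/function/env-probe", data=b"ping")
print(resp.status_code, resp.text)
```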

Relevance:

100.00%

Publisher:

Abstract:

Industry 4.0 refers to the 4th industrial revolution, and at its base lie the digitalization and the automation of the assembly line. The whole production process has improved and evolved thanks to advances in networking and AI research, which of course include machine learning, cloud computing, IoT, and other technologies that are finally being implemented in the industrial scenario. All these technologies have in common a need for faster, more secure, robust, and reliable communication. One of the many solutions to these demands is the use of mobile communication technologies in the industrial environment, but which technology is better suited to them? Of course, the answer isn't as simple as it seems. The 4th industrial revolution has an unprecedented potential with respect to the previous ones; every factory, enterprise, or company has different network demands, and even within each of these infrastructures the demands may vary by sector or by application. For example, in the health care industry there may be a need for increased bandwidth for the analysis of high-definition videos, or for faster speeds so that analytics can occur in real time, while yet another application might require higher security and reliability to protect patients' data. As seen above, choosing the right technology for the right environment and application involves many considerations, and the ones just stated are but a speck of dust with respect to the overall picture. In this thesis, we investigate a comparison between two of the technologies available for the industrial environment, Wi-Fi 6 and 5G Private Networks, in the specific case of a steel factory.

Relevance:

100.00%

Publisher:

Abstract:

The usage of Optical Character Recognition (OCR) systems is a widespread technology in the world of Computer Vision and Machine Learning. The topic interests many fields, for example the automotive field, where it becomes a specialized task known as License Plate Recognition, useful for many applications, from toll road automation to intelligent payments. However, OCR systems need to be very accurate and generalizable in order to extract the text of license plates under highly variable conditions, from the type of camera used for acquisition to changes in lighting. Such variables compromise the quality of digitized real scenes, causing noise and degradation of various types, which can be minimized with the application of modern approaches for image super-resolution and noise reduction. One class of these approaches is known as Generative Neural Networks, which are a very strong ally for the solution of this popular problem.
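
As a hedged sketch of a basic recognition pipeline of this kind (classical preprocessing plus off-the-shelf OCR, not the generative models the thesis studies), the snippet below denoises and binarizes a plate crop with OpenCV before handing it to Tesseract; the input file name and the Tesseract configuration are assumptions.

```python
# Hedged sketch of a basic plate-reading pipeline (classical preprocessing plus
# Tesseract OCR), not the generative super-resolution models the thesis studies.
# The input file name and the Tesseract configuration are assumptions.
import cv2
import pytesseract

image = cv2.imread("plate_crop.png")                  # hypothetical plate crop
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
denoised = cv2.fastNlMeansDenoising(gray, h=10)       # reduce acquisition noise
_, binary = cv2.threshold(denoised, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# --psm 7 treats the image as a single text line, a common choice for plates.
text = pytesseract.image_to_string(binary, config="--psm 7")
print(text.strip())
```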

Relevance:

100.00%

Publisher:

Abstract:

A nonlinear finite element model was developed to simulate the nonlinear response of three-leaf masonry specimens, which were subjected to laboratory tests with the aim of investigating the mechanical behaviour of multiple-leaf stone masonry walls up to failure. The specimens consisted of two external leaves made of stone bricks and mortar joints, and an internal leaf in mortar and stone aggregate. Different loading conditions, typologies of the collar joints, and stone types were taken into account. The constitutive law implemented in the model is characterized by a damage tensor, which allows the damage-induced anisotropy accompanying the cracking process to be described. To follow the post-peak behaviour of the specimens with sufficient accuracy it was necessary to make the damage model non-local, to avoid mesh-dependency effects related to the strain-softening behaviour of the material. Comparisons between the predicted and measured failure loads are quite satisfactory in most of the studied cases.
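
For orientation (the abstract does not give the model's equations), a standard form of a damage constitutive law with the non-local averaging that removes mesh dependency is sketched below; note that the paper's model uses a damage tensor, whereas the scalar form shown here is the simplest textbook variant.

```latex
% Standard scalar damage law with non-local regularization (for orientation
% only; the model in the paper uses a damage tensor rather than the scalar d).
\[
  \boldsymbol{\sigma} = (1 - d)\,\mathbf{C} : \boldsymbol{\varepsilon},
  \qquad d = g(\bar{\varepsilon}_{\mathrm{eq}}),
\]
\[
  \bar{\varepsilon}_{\mathrm{eq}}(\mathbf{x}) =
  \frac{\displaystyle\int_{\Omega} \alpha(\mathbf{x},\boldsymbol{\xi})\,
        \varepsilon_{\mathrm{eq}}(\boldsymbol{\xi})\, d\boldsymbol{\xi}}
       {\displaystyle\int_{\Omega} \alpha(\mathbf{x},\boldsymbol{\xi})\,
        d\boldsymbol{\xi}}
\]
% C is the elastic stiffness, eps_eq an equivalent strain measure, and alpha a
% weight function over a neighbourhood that introduces an internal length.
```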

Relevance:

100.00%

Publisher:

Abstract:

This work proposes the use of evolutionary computation to jointly solve the multiuser channel estimation (MuChE) and detection problems at their maximum likelihood, both in the context of direct sequence code division multiple access (DS/CDMA). The effectiveness of the proposed heuristic approach is proven by comparing performance and complexity figures of merit with those obtained by traditional methods found in the literature. Simulation results for a genetic algorithm (GA) applied to multipath DS/CDMA channel estimation and multi-user detection (MuD) show that the proposed genetic algorithm multi-user channel estimation (GAMuChE) yields a normalized mean square error (nMSE) below 11% under slowly varying multipath fading channels, a large range of Doppler frequencies and medium system load, while exhibiting lower complexity than both maximum likelihood multi-user channel estimation (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multi-user detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity compared to the optimum multi-user detector (OMuD). In addition, the complexity of the GAMuChE and GAMuD algorithms was (jointly) analyzed in terms of the number of operations necessary to reach convergence, and compared to other joint MuChE and MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future.
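
For reference, the normalized mean square error figure quoted above is conventionally defined as follows (a standard definition; the paper's exact normalization may differ slightly):

```latex
% Conventional definition of the normalized mean square error of a channel
% estimate \hat{h} against the true channel h.
\[
  \mathrm{nMSE} \;=\;
  \mathbb{E}\!\left[ \frac{\lVert \hat{\mathbf{h}} - \mathbf{h} \rVert^{2}}
                          {\lVert \mathbf{h} \rVert^{2}} \right]
\]
```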

Relevance:

100.00%

Publisher:

Abstract:

The performance of metaheuristics is highly dependent on their parameters, which need to be tuned. Parameter tuning may allow greater flexibility and robustness but requires careful initialization, and the process of deciding which parameter setting should be used is not obvious. The values of the parameters depend mainly on the problem, the instance to be solved, the search time available, and the required solution quality. This paper presents a proposal for a learning module for the autonomous parameterization of metaheuristics, integrated into a Multi-Agent System for the resolution of Dynamic Scheduling problems. The proposed learning module is inspired by the Autonomic Computing self-optimization concept, which states that systems must continuously and proactively improve their performance. The learning is implemented with Case-based Reasoning, which uses previous similar data to solve new cases, under the assumption that similar cases have similar solutions. After a literature review of the topics involved, both the AutoDynAgents system and the self-optimization module are described. Finally, a computational study is presented in which the proposed module is evaluated, the obtained results are compared with previous ones, conclusions are drawn, and future work is outlined. It is expected that this proposal can be a significant contribution to the self-parameterization of metaheuristics and to the resolution of scheduling problems in dynamic environments.
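
To illustrate the "similar cases have similar solutions" retrieval step, here is a sketch of a nearest-neighbour lookup of stored parameter settings; the case features and parameter names are illustrative assumptions, not the AutoDynAgents system's actual case representation.

```python
# Sketch of the Case-based Reasoning retrieval step: given a new scheduling
# instance, reuse the parameters of the most similar past case. The case
# features and parameter names below are illustrative assumptions.
import math

# Past cases: (feature vector describing the instance, tuned parameters).
case_base = [
    ((50, 5, 0.8), {"population": 40, "mutation_rate": 0.10}),
    ((200, 10, 0.3), {"population": 120, "mutation_rate": 0.03}),
    ((120, 8, 0.5), {"population": 80, "mutation_rate": 0.05}),
]

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def retrieve(new_case):
    """Return the parameters of the nearest stored case."""
    _, params = min(case_base, key=lambda c: distance(c[0], new_case))
    return params

# New instance: (number of jobs, number of machines, due-date tightness).
print(retrieve((130, 9, 0.45)))   # reuses the parameters of the closest case
```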