802 results for H54 - Infrastructures
Abstract:
Massive Internet of Things is expected to play a crucial role in Beyond 5G (B5G) wireless communication systems, offering seamless connectivity among heterogeneous devices without human intervention. However, given the exponential proliferation of smart devices and IoT networks, relying solely on terrestrial networks may not fully meet the demanding IoT requirements in terms of bandwidth and connectivity, especially in areas where terrestrial infrastructures are not economically viable. To unleash the full potential of 5G and B5G networks and enable seamless connectivity everywhere, the 3GPP envisions the integration of Non-Terrestrial Networks (NTNs) into terrestrial ones starting from Release 17. However, this integration requires modifications to the 5G standard to ensure reliable communications despite typical satellite channel impairments. In this framework, this thesis aims at proposing techniques at the Physical and Medium Access Control layers that require minimal adaptations to the current NB-IoT standard via NTN. Thus, the satellite impairments are first evaluated, and a detailed link budget analysis is then provided. Next, analyses at the link and system levels are conducted. In the former case, a novel algorithm leveraging time-frequency analysis is proposed to detect orthogonal preambles and estimate the signals’ arrival time. In addition, the effects of collisions on the detection probability and Bit Error Rate are investigated, and Non-Orthogonal Multiple Access approaches are proposed for the random access and data phases. The system-level analysis evaluates the performance of random access in case of congestion. Various access parameters are tested in different satellite scenarios, and the performance is measured in terms of access probability and time required to complete the procedure. Finally, a heuristic algorithm is proposed to jointly design the access and data phases, determining the number of satellite passages, the Random Access Periodicity, and the number of uplink repetitions that maximize the system's spectral efficiency.
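As a purely illustrative aside (not the thesis's actual model), the sketch below shows the kind of computation a satellite link budget analysis involves, using the standard free-space path loss and noise formulas; all parameter values are hypothetical placeholders for a LEO NB-IoT scenario.

```python
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def carrier_to_noise_db(eirp_dbw: float, g_over_t_dbk: float, distance_m: float,
                        freq_hz: float, bandwidth_hz: float,
                        extra_losses_db: float = 0.0) -> float:
    """C/N in dB: EIRP + G/T - FSPL - extra losses + 228.6 - 10*log10(B)."""
    boltzmann_db = -228.6  # 10*log10(k), in dBW/(K*Hz)
    fspl_db = free_space_path_loss_db(distance_m, freq_hz)
    return (eirp_dbw + g_over_t_dbk - fspl_db - extra_losses_db
            - boltzmann_db - 10 * math.log10(bandwidth_hz))

# Hypothetical uplink: 600 km slant range, S-band carrier, 180 kHz NB-IoT bandwidth.
print(carrier_to_noise_db(eirp_dbw=3.0, g_over_t_dbk=1.1, distance_m=600e3,
                          freq_hz=2e9, bandwidth_hz=180e3, extra_losses_db=3.0))
```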
Abstract:
The pervasive availability of connected devices in every industrial and societal sector is pushing for an evolution of the well-established cloud computing model. The emerging paradigm of the cloud continuum embraces this decentralization trend and envisions virtualized computing resources physically located between traditional datacenters and data sources. By executing totally or partially closer to the network edge, applications can react more quickly to events, thus enabling advanced forms of automation and intelligence. However, these applications also induce new data-intensive workloads with low-latency constraints that require the adoption of specialized resources, such as high-performance communication options (e.g., RDMA, DPDK, XDP). Unfortunately, cloud providers still struggle to integrate these options into their infrastructures. This risks undermining the principle of generality that underlies the economy of scale of cloud computing, by forcing developers to tailor their code to low-level APIs, non-standard programming models, and static execution environments. This thesis proposes a novel system architecture to empower cloud platforms across the whole cloud continuum with Network Acceleration as a Service (NAaaS). To provide commodity yet efficient access to acceleration, this architecture defines a layer of agnostic high-performance I/O APIs, exposed to applications and clearly separated from the heterogeneous protocols, interfaces, and hardware devices that implement it. A novel system component embodies this decoupling by offering a set of agnostic OS features to applications: memory management for zero-copy transfers, asynchronous I/O processing, and efficient packet scheduling. This thesis also explores the design space of possible implementations of this architecture by proposing two reference middleware systems and by adopting them to support interactive use cases in the cloud continuum: a serverless platform and an Industry 4.0 scenario. A detailed discussion and a thorough performance evaluation demonstrate that the proposed architecture is suitable to enable the easy-to-use, flexible integration of modern network acceleration into next-generation cloud platforms.
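As a purely illustrative sketch (the abstract does not publish an interface, and all names below are hypothetical), the decoupling described above can be pictured as a backend-agnostic I/O API whose concrete implementation (RDMA, DPDK, XDP, or plain sockets) is chosen by the platform rather than by the application.

```python
from abc import ABC, abstractmethod

class Completion:
    """Hypothetical handle used to track an asynchronous I/O operation."""

class AcceleratedChannel(ABC):
    """Hypothetical backend-agnostic channel: applications code against this
    interface and never see whether RDMA, DPDK, XDP, or plain sockets sit below."""

    @abstractmethod
    def alloc_buffer(self, size: int) -> memoryview:
        """Return a registered buffer suitable for zero-copy transfers."""

    @abstractmethod
    def send_async(self, buf: memoryview, length: int) -> Completion:
        """Enqueue a transmission and return immediately with a completion handle."""

    @abstractmethod
    def poll_completions(self, max_events: int) -> list[Completion]:
        """Harvest finished operations without blocking the application."""
```

An application written against such an interface could be scheduled on an accelerated node or fall back to a socket-based implementation without code changes, which is the kind of generality the NAaaS architecture aims to preserve.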
Abstract:
The aim of this thesis is to exploit the developments, advantages, and applications of Building Information Modelling (BIM), with emphasis on the discipline of structural design, for a steel building located in Perugia. BIM is mainly regarded as a new way of planning, constructing, and operating buildings and infrastructures. It has been found to offer greater opportunities for increased efficiency, optimization of resources, and generally better management throughout the life cycle of a facility. BIM increases the digitalization of processes and offers integrated and collaborative technologies for design, construction, and operation. To understand BIM and its benefits, one must consider all phases of a project: higher initial design costs often lead to lower construction and operation costs, and creating data-rich digital models helps to better predict and coordinate the construction phases and the operation of a building. One of the main limitations identified in the implementation of BIM is the lack of knowledge and of qualified professionals. For certain disciplines, such as structural and mechanical design, adoption depends on whether the main contractor, owner, general contractor, or architect needs to use or apply BIM in their projects; the existence of a supporting or mandatory BIM guideline may then eventually lead to its adoption. To test the potential of BIM adoption in the steel design process, some models were developed using a widely adopted authoring software (Autodesk Revit) to produce the construction drawings and material schedules needed to estimate the quantities and features of a real steel building. Once the model had been built, the whole process was analyzed and compared with the traditional design process for steel structures. Several relevant aspects, in terms of both clarity and time spent, emerged and led to the final conclusions about the benefits of the BIM methodology.
Abstract:
Today more than ever, with the recent war in Ukraine and the increasing number of attacks that affect the systems of nations and companies every day, the world realizes that cybersecurity can no longer be considered just a “cost”. It must become a pillar of the infrastructures that underpin the security of our nations and the safety of people. Critical infrastructures, such as energy, financial services, and healthcare, have become targets of many cyberattacks from several criminal groups with increasing resources and competencies, putting at risk the security and safety of companies and entire nations. This thesis investigates the state of the art regarding best practices for securing Industrial Control Systems (ICS). We study the differences between two security frameworks. The first is the Industrial Demilitarized Zone (I-DMZ), a perimeter-based security solution. The second is the Zero Trust Architecture (ZTA), which removes the concept of a perimeter and offers an entirely new approach to cybersecurity based on the slogan ‘Never trust, always verify’. Starting from this premise, the Zero Trust model embeds strict authentication, authorization, and monitoring controls for any access to any resource. We have defined two architectures, according to the state of the art and cybersecurity experts’ guidelines, to compare the I-DMZ and Zero Trust approaches to ICS security. The goal is to demonstrate how a Zero Trust approach dramatically reduces the possibility of an attacker penetrating the network or moving laterally to compromise the entire infrastructure. A third architecture has been defined based on cloud and fog/edge computing technologies. It shows how cloud solutions can improve the security and reliability of infrastructures and production processes, which can benefit from a range of new functionalities that the cloud could offer as a service. We have implemented and tested our Zero Trust solution and its ability to block intrusions or attempted attacks.
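As an illustrative sketch only (the thesis's concrete implementation is not described here, and all names are hypothetical), the per-request evaluation implied by ‘Never trust, always verify’ can be summarized as a policy decision that re-checks identity, device posture, and context for every access, with no implicit trust granted by network location.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_id: str
    resource: str
    action: str

def is_authorized(user: str, resource: str, action: str) -> bool:
    # Hypothetical least-privilege policy: explicit (user, resource, action) grants.
    allowed = {("operator", "plc-01", "read")}
    return (user, resource, action) in allowed

def policy_decision(req: AccessRequest, identity_verified: bool,
                    device_compliant: bool, risk_score: float) -> bool:
    """Evaluate a single request; nothing is trusted because of where it comes from."""
    if not identity_verified:      # strong authentication must succeed every time
        return False
    if not device_compliant:       # device posture (patch level, agent status) must pass
        return False
    if risk_score > 0.7:           # contextual risk (time, location, behaviour) threshold
        return False
    return is_authorized(req.user, req.resource, req.action)
```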
Abstract:
Industry 4.0 refers to the fourth industrial revolution, at whose basis lie the digitalization and automation of the assembly line. The whole production process has improved and evolved thanks to advances in networking and AI, which of course include machine learning, cloud computing, IoT, and other technologies that are finally being implemented in the industrial scenario. All these technologies have in common a need for faster, more secure, robust, and reliable communication. One of the many solutions to these demands is the use of mobile communication technologies in the industrial environment, but which technology is better suited to them? Of course, the answer is not as simple as it seems. The fourth industrial revolution has an unprecedented potential with respect to the previous ones: every factory, enterprise, or company has different network demands, and even within each of these infrastructures the demands may vary by sector or by application. For example, in the healthcare industry there may be a need for increased bandwidth to analyze high-definition video, for faster speeds so that analytics can occur in real time, or, in yet another application, for higher security and reliability to protect patients’ data. As seen above, choosing the right technology for the right environment and application involves many considerations, and the ones just stated are but a small part of the overall picture. In this thesis, we compare two of the technologies available for the industrial environment, Wi-Fi 6 and 5G private networks, in the specific case of a steel factory.
Abstract:
There are many natural events that can negatively affect the urban ecosystem, but weather and climate variations are certainly among the most significant. The history of settlements has been characterized by extreme events such as earthquakes and floods, which repeat themselves at different times, causing extensive damage to the built heritage at the structural and urban scales. Changes in climate also alter various climatic subsystems, modifying rainfall regimes and hydrological cycles and increasing the frequency and intensity of extreme precipitation events (heavy rainfall). From a hydrological risk perspective, it is crucial to understand the future events that could occur, and their magnitude, in order to design safer infrastructures. Unfortunately, it is not easy to anticipate future scenarios, as the complexity of the climate system is enormous. For this thesis, precipitation and discharge extremes were primarily used as data sources. It is important to underline that the two data sets are not independent: changes in the rainfall regime due to climate change could significantly affect overflows into receiving water bodies. It is imperative that we understand and model climate change effects on water structures to support the development of adaptation strategies. The main purpose of this thesis is to identify suitable water structures for a road located along the Tione River; by analyzing the area from a hydrological point of view, we aim to guarantee the safety of the infrastructure over time. The observations made are intended to underline how models such as stochastic ones can improve the quality of an analysis for design purposes and influence design choices.
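As a generic illustration of the kind of extreme-value reasoning such a design requires (the data and distribution choice here are hypothetical, not taken from the thesis), annual maxima can be fitted with a Gumbel distribution to estimate the rainfall depth associated with a given return period.

```python
import numpy as np
from scipy import stats

# Hypothetical annual maximum daily rainfall series (mm) for the catchment.
annual_max_mm = np.array([48.0, 61.5, 55.2, 72.8, 66.1, 51.3, 80.4,
                          59.7, 69.0, 74.5, 63.2, 57.8, 85.1, 70.3])

# Fit a Gumbel (EV1) distribution, commonly used for rainfall extremes.
loc, scale = stats.gumbel_r.fit(annual_max_mm)

# Design depth for return period T = quantile with non-exceedance probability 1 - 1/T.
for T in (10, 50, 100, 200):
    depth = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"T = {T:>3} years -> design rainfall depth ~ {depth:.1f} mm")
```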
Abstract:
Since the majority of the world's population lives in cities, and this share is expected to increase in the coming years, one of the biggest research challenges is determining the risk deriving from the high temperatures experienced in urban areas, together with improving responses to climate-related disasters, for example by introducing into the urban context vegetation or built infrastructures that can improve air quality. In this work, we investigate how different setups of the boundary and initial conditions imposed on an urban canyon generate different patterns of pollutant dispersion. To do so, we exploit the low computational cost of Reynolds-Averaged Navier-Stokes (RANS) simulations to reproduce the dynamics of an infinite array of two-dimensional square urban canyons. A pollutant is released at street level to mimic the presence of traffic. The RANS simulations are run with the k-ε closure model, and vertical profiles of significant variables of the urban canyon, namely the velocity, the turbulent kinetic energy, and the concentration, are presented. This is done using the open-source software OpenFOAM, modifying the standard solver simpleFoam to include the concentration equation and the temperature, the latter by introducing a buoyancy term in the governing equations. The results of the simulations are validated against experimental results and Large-Eddy Simulation (LES) products from previous works, showing that the simulations reproduce all the quantities under examination with satisfactory accuracy. Moreover, this comparison shows that, although LES is known to be more accurate, albeit more expensive, RANS simulations represent a reliable tool when a smaller computational cost is required. Overall, this work exploits the low computational cost of RANS simulations to produce multiple scenarios useful to evaluate how the dispersion of a pollutant changes when key variables, such as the temperature, are modified.
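For reference, a minimal sketch of the two modifications mentioned above, written in standard RANS notation with the usual symbols (not taken from the thesis): a Boussinesq buoyancy term added to the averaged momentum equation, and a Reynolds-averaged transport equation for the mean pollutant concentration with a street-level source.

```latex
% Averaged momentum equation with Boussinesq buoyancy term (g_i is the gravity vector):
\frac{\partial U_i}{\partial t} + U_j \frac{\partial U_i}{\partial x_j}
  = -\frac{1}{\rho_0}\frac{\partial P}{\partial x_i}
  + \frac{\partial}{\partial x_j}\left[(\nu + \nu_t)\frac{\partial U_i}{\partial x_j}\right]
  - g_i\,\beta\,(T - T_0)

% Transport equation for the mean concentration C with source term S:
\frac{\partial C}{\partial t} + U_j \frac{\partial C}{\partial x_j}
  = \frac{\partial}{\partial x_j}\left[\left(\frac{\nu}{Sc} + \frac{\nu_t}{Sc_t}\right)
    \frac{\partial C}{\partial x_j}\right] + S
```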