896 results for computation- and data-intensive applications


Relevance: 100.00%

Publisher:

Abstract:

The research activities involved the application of Geomatic techniques in the Cultural Heritage field, along two themes. The first is the application of high-precision surveying techniques to the restoration and interpretation of relevant monuments and archaeological finds. The main case study concerns the generation of a high-fidelity 3D model of the Fountain of Neptune in Bologna. In this work, aimed at the restoration of the artifact, both geometric and radiometric aspects were crucial. The final product formed the basis of a 3D information system, a shared tool through which the different professionals involved in the restoration contributed in a multidisciplinary approach. The second theme is the arrangement of 3D databases for a Building Information Modeling (BIM) approach, in a process that involves the generation and management of digital representations of the physical and functional characteristics of historical buildings, towards a so-called Historical Building Information Model (HBIM). A first application was conducted for the church of San Michele in Acerboli in Santarcangelo di Romagna. The survey was performed by integrating classical and modern Geomatic techniques, and the resulting point cloud of the church was used to develop an HBIM model in which the relevant information about the building could be stored and georeferenced. A second application concerns the domus of Obellio Firmo in Pompeii, also surveyed by integrating classical and modern Geomatic techniques. A historical analysis permitted the definition of construction phases and the organization of a database of materials and constructive elements. The goal is to obtain a federated model able to manage the documentary, analytical, and reconstructive aspects.
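
Georeferencing a surveyed point cloud, as described above, amounts to applying a rigid-body transformation from the local survey frame into the geodetic frame. The sketch below is purely illustrative (the rotation angle, translation vector, and coordinates are hypothetical, not the actual survey parameters):

```python
import numpy as np

def georeference(points, theta_z, translation):
    """Apply a rigid-body transform: rotate about the vertical (z) axis
    by theta_z radians, then translate into the target frame."""
    c, s = np.cos(theta_z), np.sin(theta_z)
    rotation = np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])
    return points @ rotation.T + translation

# Illustrative local point cloud (metres) and transform parameters
local_cloud = np.array([[1.0, 0.0, 0.0],
                        [0.0, 2.0, 0.5]])
geo_cloud = georeference(local_cloud, np.pi / 2, np.array([100.0, 200.0, 50.0]))
print(geo_cloud)
```

In practice the transform parameters come from control points measured with GNSS or a total station; here they are invented for illustration.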

Relevance: 100.00%

Publisher:

Abstract:

This thesis presents the study of small nitrogen-bearing molecules, from diatomic radicals to complex organic molecules, by means of rotational and ro-vibrational spectroscopy. Besides their theoretical relevance, which spans from anharmonic force-field analyses to energetic and structural properties, I chose this family of species because of its astrochemical importance. After some basics of molecular spectroscopy and astrochemistry are introduced, the instrumentation used during my PhD is described. Then, the most relevant studies I conducted during the last three years are presented. Overall, a number of molecules of astrophysical relevance have been characterized by means of rotational and ro-vibrational spectroscopy. The sample of studied species comprises small radicals (imidogen, amidogen, and titanium nitride), cyanopolyynes (cyanoacetylene), and pre-biotic molecules (aminoacetonitrile); these studies are presented in great detail. Among the results, the first astronomical detection of two deuterated radicals (NHD and ND2) is presented in this thesis. Thanks to our studies, it was possible to clearly identify molecular absorptions of these species towards the pre-stellar core IRAS 16293-2422, as recorded by the Herschel Space Observatory mission. These observations confirm the strong deuterium enhancement generally observed in this cloud, but they reveal that models underestimate the abundances of NHD and ND2. I also report the detection of vibrationally excited aminoacetonitrile (NH2CH2CN) in Sagittarius B2, as observed in the ReMoCA survey. This is the second detection of aminoacetonitrile in the interstellar medium and the first astronomical observation of its vibrationally hot lines. It represents a small step toward understanding how complex organic molecules are formed and which processes can lead to the formation of glycine.
Finally, a few general remarks are discussed and the importance of future laboratory studies is pointed out, along with possible perspectives.
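
As textbook background for the rotational spectra underpinning these studies (not specific to the species analysed here), the energy levels and transition frequencies of a linear molecule treated as a semi-rigid rotor are:

```latex
E(J) = B\,J(J+1) - D\,\left[J(J+1)\right]^{2},
\qquad
\nu_{J+1 \leftarrow J} = 2B\,(J+1) - 4D\,(J+1)^{3}
```

where B is the rotational constant, D the centrifugal-distortion constant, and J the rotational quantum number; fitting measured line frequencies to such expressions is what yields the spectroscopic constants used for astronomical identification.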

Relevance: 100.00%

Publisher:

Abstract:

Early definitions of the Smart Building focused almost entirely on the technological aspect and did not consider user interaction at all; today we would rather associate them with the concept of the automated building. In this sense, the control of comfort conditions inside buildings is a well-investigated problem, since it has a direct effect on users' productivity and an indirect effect on energy saving. From the users' perspective, a typical environment can be considered comfortable if it provides adequate thermal comfort, visual comfort, indoor air quality, and acoustic comfort. In recent years, the scientific community has dealt with many challenges, especially from a technological point of view. For instance, smart sensing devices, the Internet, and communication technologies have enabled a new paradigm called edge computing, which brings computation and data storage closer to the location where they are needed, improving response times and saving bandwidth. This has made it possible to improve services, sustainability, and decision making. Many solutions have been implemented, such as smart classrooms, control of the thermal conditions of buildings, monitoring of HVAC data for an energy-efficient campus, and so forth. Although these projects contribute to the realization of a smart campus, a general framework for the smart campus has yet to be defined. These new technologies have also introduced new research challenges: in this thesis work, some of the principal open challenges are faced, proposing a new conceptual framework, technologies, and tools to move the actual implementation of smart campuses forward. With this in mind, several problems known in the literature have been investigated: occupancy detection, noise monitoring for acoustic comfort, context awareness inside buildings, indoor wayfinding, and strategic sensor deployment for air quality monitoring and book preservation.
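
The bandwidth-saving idea behind edge computing can be illustrated with a toy sketch (function name, window size, and sensor values are all hypothetical): instead of streaming every raw sample to the cloud, an edge node aggregates readings locally and forwards only a compact summary.

```python
from statistics import mean

def summarize_on_edge(raw_samples, window=4):
    """Aggregate raw sensor readings into per-window summaries,
    so only len(raw_samples) / window values leave the edge node."""
    summaries = []
    for i in range(0, len(raw_samples), window):
        chunk = raw_samples[i:i + window]
        summaries.append(round(mean(chunk), 2))
    return summaries

# Hypothetical temperature samples from a classroom sensor
samples = [21.0, 21.2, 21.1, 21.3, 22.0, 22.4, 22.2, 22.6]
print(summarize_on_edge(samples))  # two summaries instead of eight raw samples
```

Real deployments add time-stamping, buffering, and a transport protocol, but the core trade-off, local computation in exchange for reduced upstream traffic, is the one shown here.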

Relevance: 100.00%

Publisher:

Abstract:

Monolithic materials cannot always satisfy the demands of today's advanced requirements. Only by combining several materials at different length scales, as nature does, can the requested performance be met. Polymer nanocomposites are intended to overcome the common drawbacks of pristine polymers through a multidisciplinary collaboration of materials science with chemistry, engineering, and nanotechnology. These materials are an active combination of polymers and nanomaterials in which at least one phase lies in the nanometre range. By mimicking nature's materials, it is possible to develop new nanocomposites for structural applications demanding combinations of strength and toughness. In this perspective, nanofibers obtained by electrospinning have been increasingly adopted in the last decade to improve the fracture toughness of Fiber Reinforced Plastic (FRP) laminates. Although nanofibers have already found applications in various fields, their widespread introduction in the industrial context is still a long way off. This thesis aims to develop methodologies and models able to predict the behaviour of nanofibrous-reinforced polymers, paving the way for their practical engineering applications. It consists of two main parts. The first investigates the mechanisms that act at the nanoscale, systematically evaluating the mechanical properties of both the nanofibrous reinforcement phase (Chapter 1) and the hosting polymeric matrix (Chapter 2). The second part deals with the implementation of different types of nanofibers for novel pioneering applications, trying to combine the well-known fracture toughness enhancement in composite laminates with the improvement of other mechanical properties or the inclusion of novel functionalities. Chapter 3 reports the development of novel adhesive carriers made of nylon 6,6 nanofibrous mats to increase the fracture toughness of epoxy-bonded joints.
In Chapter 4, recently developed rubbery nanofibers are used to enhance the damping properties of unidirectional carbon fiber laminates. Lastly, in Chapter 5, a novel self-sensing composite laminate capable of detecting impacts on its surface using PVDF-TrFE piezoelectric nanofibers is presented.
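
As general background (standard data reduction for double cantilever beam tests, e.g. per ASTM D5528, not a formula taken from this thesis), the Mode I interlaminar fracture toughness that nanofibrous interleaves are meant to improve is commonly evaluated from simple beam theory as:

```latex
G_{I} = \frac{3\,P\,\delta}{2\,b\,a}
```

where P is the applied load, δ the load-point displacement, b the specimen width, and a the delamination length; toughening interleaves show up as a higher critical value of this quantity at crack propagation.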

Relevance: 100.00%

Publisher:

Abstract:

Hadrontherapy employs high-energy beams of charged particles (protons and heavier ions) to treat deep-seated tumours: these particles have a favourable depth-dose distribution in tissue, characterized by a low dose in the entrance channel and a sharp maximum (the Bragg peak) near the end of their path. In these treatments, nuclear interactions have to be considered: beam particles can fragment in the human body, releasing a non-zero dose beyond the Bragg peak, while fragments of the nuclei of the human body can modify the dose released in healthy tissues. These effects are still poorly quantified, given the lack of relevant cross-section data. Space radioprotection can also profit from fragmentation cross-section measurements: interest in long-term manned space missions beyond Low Earth Orbit has been growing in recent years, but such missions must cope with major health risks due to space radiation. To this end, risk models are under study; however, large gaps in fragmentation cross-section data currently prevent an accurate benchmark of deterministic and Monte Carlo codes. To fill these gaps, the FOOT (FragmentatiOn Of Target) experiment was proposed. It is composed of two independent and complementary setups: an Emulsion Cloud Chamber and an electronic setup made of several subdetectors providing redundant measurements of the kinematic properties of the fragments produced in nuclear interactions between a beam and a target. FOOT aims to measure double-differential cross sections in both angle and kinetic energy, which is the most complete information available to address the existing questions. In this Ph.D. thesis, the development of the Trigger and Data Acquisition system for the FOOT electronic setup and a first analysis of data from a 400 MeV/u 16O beam on a carbon target, acquired in July 2021 at GSI (Darmstadt, Germany), are presented. When possible, a comparison with other available measurements is also reported.
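
The sharp Bragg peak mentioned above follows from the velocity dependence of the electronic stopping power; in its standard Bethe form, for a projectile of charge number z traversing a medium with electron density n_e and mean excitation energy I,

```latex
-\left\langle \frac{dE}{dx} \right\rangle
= \frac{4\pi e^{4} z^{2}}{m_e c^{2} \beta^{2}}\, n_e
\left[ \ln\!\left( \frac{2 m_e c^{2} \beta^{2}}{I\,(1-\beta^{2})} \right) - \beta^{2} \right]
```

so the energy loss per unit path grows roughly as 1/β² as the particle slows down and rises steeply toward the end of range, producing the peak; nuclear fragmentation, the subject of FOOT, modifies this ideal single-particle picture.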

Relevance: 100.00%

Publisher:

Abstract:

Biomarkers are biological indicators of human health conditions. Their ultra-sensitive quantification is of paramount importance in clinical monitoring and early disease diagnosis. Biosensors are simple and easy-to-use analytical devices, and in this field electrochemiluminescence (ECL) is one of the most promising analytical techniques, one that needs ever-increasing sensitivity to improve its clinical effectiveness. The scope of this project was the investigation of ECL generation mechanisms in order to enhance the ECL intensity, also through the identification of suitable nanostructures. The combination of nanotechnology, microscopy, and ECL has proved to be a very successful strategy for improving the analytical efficiency of ECL in one of its most promising bioanalytical approaches, the bead-based immunoassay. Nanosystems such as [Ru(bpy)3]2+-dye-doped nanoparticles (DDSNPs) and Bodipy carbon nanodots have been used to improve the sensitivity of ECL techniques thanks to their advantageous and tuneable properties, reaching a signal increase of 750% in the DDSNP bead-based immunoassay system. In this thesis, an investigation of size and distance effects on the ECL mechanisms was carried out through the innovative combination of ECL microscopy and electrochemical mapping of radicals. This allowed the discovery of an unexpected and highly efficient mechanistic path for ECL generation at small distances from the electrode surface, which was exploited and enhanced through the addition of a branched amine, DPIBA, to the usual coreactant TPrA solution, raising the ECL efficiency by up to 128%. Finally, a bead-based immunoassay and an immunosensor specific for cardiac Troponin I were built by exploiting the previous results and the features of carbon nanotubes, which created a conductive layer around the beads, enhancing the signal by 70% and activating an ECL mechanism not previously observed in such systems.
In conclusion, the combination of ECL microscopy and nanotechnology, together with a deep understanding of the mechanisms responsible for ECL emission, led to a great enhancement of the signal.
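
As general background (the textbook "oxidative-reduction" coreactant route for the Ru(bpy)3/TPrA couple, not a result specific to this thesis), the reaction sequence usually written for bead-based ECL is:

```latex
\begin{aligned}
\mathrm{TPrA} - e^{-} &\rightarrow \mathrm{TPrA}^{\bullet+} \rightarrow \mathrm{TPrA}^{\bullet} + \mathrm{H}^{+} \\
\mathrm{Ru(bpy)}_3^{2+} + \mathrm{TPrA}^{\bullet} &\rightarrow \mathrm{Ru(bpy)}_3^{+} \\
\mathrm{Ru(bpy)}_3^{+} + \mathrm{TPrA}^{\bullet+} &\rightarrow \mathrm{Ru(bpy)}_3^{2+\ast} \rightarrow \mathrm{Ru(bpy)}_3^{2+} + h\nu
\end{aligned}
```

Because the TPrA radicals are short-lived, the emitting label must sit within their diffusion length of the electrode, which is why the distance effects mapped by ECL microscopy in this work matter so much for signal intensity.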

Relevance: 100.00%

Publisher:

Abstract:

Effective field theories (EFTs) are ubiquitous in theoretical physics and in particular in field theory descriptions of quantum systems probed at energies much lower than one or few characterizing scales. More recently, EFTs have gained a prominent role in the study of fundamental interactions and in particular in the parametriasation of new physics beyond the Standard Model, which would occur at scales Λ, much larger than the electroweak scale. In this thesis, EFTs are employed to study three different physics cases. First, we consider light-by-light scattering as a possible probe of new physics. At low energies it can be described by dimension-8 operators, leading to the well-known Euler-Heisenberg Lagrangian. We consider the explicit dependence of matching coefficients on type of particle running in the loop, confirming the sensitiveness to the spin, mass, and interactions of possibly new particles. Second, we consider EFTs to describe Dark Matter (DM) interactions with SM particles. We consider a phenomenologically motivated case, i.e., a new fermion state that couples to the Hypercharge through a form factor and has no interactions with photons and the Z boson. Results from direct, indirect and collider searches for DM are used to constrain the parameter space of the model. Third, we consider EFTs that describe axion-like particles (ALPs), whose phenomenology is inspired by the Peccei-Quinn solution to strong CP problem. ALPs generically couple to ordinary matter through dimension-5 operators. In our case study, we investigate the rather unique phenomenological implications of ALPs with enhanced couplings to the top quark.
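
For reference, the dimension-8 Lagrangian referred to above takes, for an electron loop of mass m_e in QED, the standard Euler-Heisenberg form:

```latex
\mathcal{L}_{\mathrm{EH}} =
\frac{\alpha^{2}}{90\, m_e^{4}}
\left[ \left( F_{\mu\nu} F^{\mu\nu} \right)^{2}
+ \frac{7}{4} \left( F_{\mu\nu} \tilde{F}^{\mu\nu} \right)^{2} \right]
```

The numerical coefficients of the two operators are the matching coefficients whose dependence on the spin, mass, and couplings of the particle in the loop is analysed in the thesis.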

Relevance: 100.00%

Publisher:

Abstract:

Bone disorders have a severe impact on body functions and quality of life, and no satisfying therapies exist yet. The current models for studying bone disease are scarcely predictive, and the existing therapeutic options fail for complex cases. To mimic and/or restore bone, 3D printing/bioprinting allows the creation of 3D structures with different material compositions, properties, and designs. In this study, 3D printing/bioprinting has been explored for (i) 3D in vitro tumor models and (ii) regenerative medicine. Tumor models have been developed by investigating different bioinks (i.e., alginate and modified gelatin) enriched with hydroxyapatite nanoparticles to increase printing fidelity and the level of biomimicry, thus mimicking the organic and inorganic phases of bone. High Saos-2 cell viability was obtained, and the formation of spheroid clusters, as occurs in vivo, was observed. To develop new synthetic bone grafts, two approaches have been explored. In the first, novel magnesium-phosphate scaffolds have been investigated by extrusion-based 3D printing for spinal fusion. The 3D printing process and its parameters have been optimized to obtain custom-shaped structures with competent mechanical properties. The 3D-printed structures have been combined with porous alginate structures created by a novel ice-templating technique and loaded with an antibiotic drug to address infection prevention; promising results in terms of planktonic growth inhibition were obtained. In the second strategy, marine waste precursors have been considered for conversion into biogenic hydroxyapatite (HA) using a mild wet conversion method with different parameters. The HA/carbonate conversion efficacy was analysed for each precursor (by FTIR and SEM), and the best conditions were combined with alginate to develop a composite structure. The composite paste was successfully employed in a custom-modified 3D printer to obtain stable 3D-printed scaffolds.
In conclusion, the osteomimetic materials developed in this study for bone models and synthetic grafts are promising for the bone field.

Relevance: 100.00%

Publisher:

Abstract:

This thesis represents the conclusive outcome of the European Joint Doctorate programme in Law, Science & Technology, funded by the European Commission through the Marie Skłodowska-Curie Innovative Training Networks actions within H2020, grant agreement n. 814177. The tension between data protection and privacy on one side, and the need to allow further uses of processed personal data on the other, is investigated, tracing the technological development of the de-anonymization/re-identification risk with an explorative survey. After acknowledging its extent, it is questioned whether a certain degree of anonymity can still be granted, focusing on a double perspective: an objective and a subjective one. The objective perspective focuses on the data-processing models per se, while the subjective perspective investigates whether the distribution of roles and responsibilities among stakeholders can ensure data anonymity.

Relevance: 100.00%

Publisher:

Abstract:

The discovery of new materials and their functions has always been a fundamental component of technological progress. Nowadays, the quest for new materials is stronger than ever: sustainability, medicine, robotics, and electronics are all key assets that depend on the ability to create specifically tailored materials. However, designing materials with desired properties is a difficult task, and the complexity of the discipline makes it difficult to identify general criteria. While scientists have developed a set of best practices (often based on experience and expertise), this is still a trial-and-error process. It becomes even more complex when dealing with advanced functional materials, whose properties depend on structural and morphological features, which in turn depend on fabrication procedures and environment; subtle alterations lead to dramatically different results. Because of this, materials modeling and design is one of the most prolific research fields, and many techniques and instruments are continuously developed to enable new possibilities, both in the experimental and computational realms. However, the field is strongly affected by unorganized file management and the proliferation of custom data formats and storage procedures, in both experimental and computational research. Results are difficult to find, interpret, and re-use, and a huge amount of time is spent interpreting and re-organizing data. This also strongly limits the application of data-driven and machine learning techniques. This work introduces possible solutions to the problems described above.
Specifically, it covers: developing features for specific classes of advanced materials and using them to train machine learning models and accelerate computational predictions for molecular compounds; developing methods for organizing non-homogeneous materials data; automating the use of device simulations to train machine learning models; and dealing with scattered experimental data, using them to discover new patterns.
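
The descriptor-to-property pipeline described above can be sketched minimally (all descriptors, target values, and the linear surrogate are hypothetical illustrations, not the thesis's actual models): represent each compound as a numeric feature vector, then fit a model that predicts a property from those features.

```python
import numpy as np

# Toy feature matrix: each row is a compound described by
# two hypothetical descriptors (e.g. atom count, mean electronegativity)
features = np.array([[8.0, 2.1],
                     [12.0, 2.5],
                     [20.0, 3.0],
                     [16.0, 2.8]])
# Hypothetical target property values for the four compounds
targets = np.array([1.0, 1.6, 2.6, 2.1])

# Fit a linear surrogate model: targets ~ features @ w + b
design = np.hstack([features, np.ones((len(features), 1))])
coeffs, *_ = np.linalg.lstsq(design, targets, rcond=None)

def predict(descriptor_vector):
    """Predict the property of a new compound from its descriptors."""
    return float(np.dot(np.append(descriptor_vector, 1.0), coeffs))

print(predict([10.0, 2.3]))  # prediction for an unseen compound
```

Real workflows replace the linear fit with richer models and the toy descriptors with physically motivated features, but the structure, organized data in, trained predictor out, is the point the thesis argues for.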

Relevance: 100.00%

Publisher:

Abstract:

In this thesis work, a cosmic-ray telescope was set up in the INFN laboratories in Bologna using smaller-size replicas of the CMS Drift Tube chambers, called MiniDTs, to test and develop new electronics for the CMS Phase-2 upgrade. The MiniDTs were assembled at the INFN National Laboratory in Legnaro, Italy. Scintillator tiles complete the telescope, providing a signal independent of the MiniDTs for offline analysis. The telescope readout is a test system for the CMS Phase-2 upgrade data-acquisition design. It is based on an early prototype of a radiation-hard FPGA-based board developed for the High-Luminosity LHC CMS upgrade, called the On Board electronics for Drift Tubes. Once the setup was operational, we developed an online monitor to display the most important observables in real time, in order to check the quality of the data acquisition. We then performed an offline analysis of the collected data using a custom version of the CMS software tools, which allowed us to estimate the time pedestal and drift velocity of each chamber, evaluate the efficiency of the different DT cells, and measure the space and time resolution of the telescope system.
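
The time pedestal and drift velocity mentioned above enter the standard drift-tube position reconstruction: a hit's drift distance is the drift velocity times the measured time minus the pedestal. A minimal sketch with illustrative numbers (CMS DT chambers have a drift velocity of roughly 54 μm/ns; the times below are invented for illustration):

```python
def drift_distance(t_meas_ns, t_pedestal_ns, v_drift_um_per_ns=54.0):
    """Convert a measured TDC time into a drift distance (mm).
    t_pedestal_ns is the per-chamber time offset (pedestal)."""
    drift_time = t_meas_ns - t_pedestal_ns
    if drift_time < 0:
        raise ValueError("measured time earlier than the pedestal")
    return drift_time * v_drift_um_per_ns / 1000.0  # convert um to mm

# Illustrative hit: TDC time of 450 ns with a 250 ns pedestal
print(drift_distance(450.0, 250.0))  # -> 10.8 mm
```

Calibrating t_pedestal and v_drift per chamber, as done in the offline analysis, is what makes this linear time-to-distance relation usable for track fitting.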