16 results for Multi-environment Trials
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
In the present study we use multivariate analysis techniques to discriminate signal from background in the fully hadronic decay channel of ttbar events. We give a brief introduction to the role of the top quark in the Standard Model and a general description of the CMS experiment at the LHC. We used the CMS experiment computing and software infrastructure to generate and prepare the data samples used in this analysis. We tested the performance of three different classifiers applied to our data samples and used the selection obtained with the Multi-Layer Perceptron classifier to give an estimate of the statistical and systematic uncertainties on the cross section measurement.
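As a rough illustration of the kind of classifier study described above, the following is a minimal sketch of training a Multi-Layer Perceptron to separate signal from background; the synthetic Gaussian features, network size and figure of merit are placeholder assumptions, not the kinematic variables or code actually used in the thesis.

```python
# Hedged sketch: training an MLP to separate signal from background.
# The synthetic features below are placeholders, not the kinematic
# variables actually used in the thesis.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Toy "signal" and "background" samples drawn from shifted Gaussians.
signal = rng.normal(loc=1.0, scale=1.0, size=(n, 4))
background = rng.normal(loc=0.0, scale=1.0, size=(n, 4))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

# Area under the ROC curve as a simple figure of merit for the separation.
scores = clf.predict_proba(X_test)[:, 1]
print("ROC AUC:", roc_auc_score(y_test, scores))
```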
Abstract:
In this thesis, we deal with the design of experiments in the drug development process, focusing on the design of clinical trials for treatment comparisons (Part I) and the design of preclinical laboratory experiments for protein development and manufacturing (Part II). In Part I we propose a multi-purpose design methodology for sequential clinical trials. We derive optimal allocations of patients to treatments for testing the efficacy of several experimental groups, taking ethical considerations into account as well. We first consider exponential responses for survival trials and then present a unified framework for heteroscedastic experimental groups that encompasses the general ANOVA set-up. The very good performance of the suggested optimal allocations, in terms of both inferential and ethical characteristics, is illustrated analytically and through several numerical examples, including comparisons with other designs proposed in the literature. Part II concerns the planning of experiments for processes composed of multiple steps in the context of preclinical drug development and manufacturing. Following the Quality by Design paradigm, the objective of the multi-step design strategy is the definition of the manufacturing design space of the whole process; because we take the interactions among the subsequent steps into account, our proposal ensures the quality and safety of the final product while enabling more flexibility and process robustness in manufacturing.
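To make the idea of allocation under heteroscedasticity concrete, here is a minimal sketch of the textbook Neyman-type allocation, where sample sizes are proportional to the group standard deviations; this is only a standard baseline for illustration, not the ethically adjusted optimal allocations derived in the thesis, and the group standard deviations and total sample size are invented.

```python
# Hedged sketch: a textbook Neyman-type allocation for heteroscedastic
# groups (sample sizes proportional to the response standard deviations).
# This is only a baseline illustration, not the ethically adjusted
# optimal allocation derived in the thesis.
def neyman_allocation(std_devs, total_patients):
    total_sd = sum(std_devs)
    raw = [total_patients * s / total_sd for s in std_devs]
    # Round down, then give leftover units to the groups with the
    # largest fractional parts so the total sample size is preserved.
    sizes = [int(x) for x in raw]
    remainder = total_patients - sum(sizes)
    fractional = sorted(range(len(raw)), key=lambda i: raw[i] - sizes[i], reverse=True)
    for i in fractional[:remainder]:
        sizes[i] += 1
    return sizes

# Example: three experimental groups and one control with different variability.
print(neyman_allocation([1.0, 2.0, 1.5, 0.5], total_patients=200))
```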
Abstract:
The present thesis aims to provide a thorough understanding of the vaginal ecosystem of pregnant women and to enhance the knowledge of pregnancy pathophysiology. The first study emphasized the importance of limiting protein intake from animal sources, consuming carbohydrates, and avoiding starting pregnancy overweight to maintain a healthy vaginal environment characterized by lactobacilli and related metabolites. In the second paper, a reduction in bacterial diversity, an increase in Lactobacillus abundance, and a decrease in bacterial vaginosis-related genera were observed during pregnancy. Lactobacillus abundance correlated with higher levels of lactate, sarcosine, and amino acids, while bacterial vaginosis-related genera were associated with amines, formate, acetate, alcohols, and short-chain fatty acids. An association between intrapartum antibiotic prophylaxis for Group B Streptococcus and a higher vaginal abundance of Prevotella was found. Moreover, women experiencing a first-trimester miscarriage displayed a higher abundance of Fusobacterium. The third study explored the presence of macrolide and tetracycline resistance genes in the vaginal environment, highlighting that different vaginal microbiota types were associated with distinct resistance profiles. Lactobacilli-dominated ecosystems showed fewer or no resistance genes, while women with increased bacterial vaginosis-related genera were positive for resistance genes. The last two papers aimed to identify potential biomarkers of vaginal health or disease status. The fourth paper showed that positivity for Torquetenovirus decreased from the first to the third trimester, being more prevalent in women with higher vaginal leukocyte counts. Torquetenovirus-positive samples showed higher levels of cytokines, propionate, and cadaverine. Lactobacillus species decreased in Torquetenovirus-positive samples, while Sneathia and Shuttleworthia increased. The last work pointed out the association between clade 2 of Gardnerella vaginalis and bacterial vaginosis. Moreover, as the number of simultaneously detected G. vaginalis clades increased, bacterial vaginosis-associated bacteria also tended to increase. Additionally, sialidase gene levels negatively correlated with Lactobacillus and positively correlated with Gardnerella, Atopobium, Prevotella, Megasphaera, and Sneathia.
Abstract:
Unlike traditional wireless networks, characterized by the presence of last-mile, static and reliable infrastructures, Mobile Ad hoc Networks (MANETs) are dynamically formed by collections of mobile and static terminals that exchange data by enabling each other's communication. Supporting multi-hop communication in a MANET is a challenging research area because it requires cooperation between different protocol layers (MAC, routing, transport). In particular, MAC and routing protocols can be considered mutually cooperative protocol layers. When a route is established, the exposed and hidden terminal problems at the MAC layer may decrease the end-to-end performance proportionally with the length of each route. Conversely, contention at the MAC layer may cause a routing protocol to respond by initiating new route queries and routing table updates. Multi-hop communication may also benefit from the presence of pseudo-centralized virtual infrastructures obtained by grouping nodes into clusters. Clustering structures may facilitate the spatial reuse of resources, thereby increasing the system capacity; at the same time, the clustering hierarchy may be used to coordinate transmission events inside the network and to support intra-cluster routing schemes. Again, MAC and clustering protocols can be considered mutually cooperative protocol layers: the clustering scheme can support MAC-layer coordination among nodes by shifting the distributed MAC paradigm towards a pseudo-centralized one. On the other hand, the system benefits of the clustering scheme can be emphasized by the pseudo-centralized MAC layer through support for differentiated access priorities and controlled contention. In this thesis, we propose cross-layer solutions involving the joint design of MAC, clustering and routing protocols in MANETs. As the main contribution, we study and analyze the integration of MAC and clustering schemes to support multi-hop communication in large-scale ad hoc networks. A novel clustering protocol, named Availability Clustering (AC), is defined under general node heterogeneity assumptions in terms of connectivity, available energy and relative mobility. On this basis, we design and analyze a distributed and adaptive MAC protocol, named Differentiated Distributed Coordination Function (DDCF), whose focus is to implement adaptive access differentiation based on the node roles assigned by the upper-layer clustering scheme. We extensively simulate the proposed clustering scheme, showing its effectiveness in dominating the network dynamics under demanding mobility models and different mobility rates. Based on these results, we propose a possible application of the cross-layer MAC+clustering scheme to support the fast propagation of alert messages in a vehicular environment. At the same time, we investigate the integration of MAC and routing protocols in large-scale multi-hop ad hoc networks. A novel multipath routing scheme is proposed, extending the AOMDV protocol with a novel load-balancing approach to concurrently distribute the traffic among the multiple paths. We also study the composition effect of an IEEE 802.11-based enhanced MAC forwarding mechanism called Fast Forward (FF), used to reduce the effects of self-contention among frames at the MAC layer. The protocol framework is modelled and extensively simulated for a large set of metrics and scenarios. For both schemes, the simulation results reveal the benefits of the cross-layer MAC+routing and MAC+clustering approaches over single-layer solutions.
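To illustrate the kind of role assignment a clustering layer can hand to the MAC layer, here is a toy sketch of electing cluster heads from a score that combines connectivity, residual energy and low relative mobility; the score, the weights and the node data are illustrative assumptions, not the actual Availability Clustering (AC) metric or protocol defined in the thesis.

```python
# Hedged sketch: electing cluster heads from a score combining connectivity,
# residual energy, and (low) relative mobility.  The weights and the score
# itself are illustrative assumptions, not the AC protocol specification.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    degree: int        # number of neighbours (connectivity)
    energy: float      # residual energy, normalised to [0, 1]
    mobility: float    # relative mobility, normalised to [0, 1]

def availability_score(node, w_deg=0.4, w_energy=0.4, w_mob=0.2, max_degree=10):
    # Higher connectivity and energy increase the score; mobility decreases it.
    return (w_deg * min(node.degree / max_degree, 1.0)
            + w_energy * node.energy
            + w_mob * (1.0 - node.mobility))

def elect_cluster_heads(nodes, n_heads):
    ranked = sorted(nodes, key=availability_score, reverse=True)
    return [n.node_id for n in ranked[:n_heads]]

nodes = [Node(1, 8, 0.9, 0.1), Node(2, 3, 0.5, 0.7), Node(3, 6, 0.8, 0.3)]
print(elect_cluster_heads(nodes, n_heads=1))
```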
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing, at runtime, typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies has become a central point of the scientific activity in this field. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models becomes fundamental for comparing and evaluating the methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; however, it is clear at least that all non-agent elements of a multi-agent system are typically considered to be part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions, i.e. entities of the environment encapsulating some functions, and topology abstractions, i.e. entities of the environment that represent its (either logical or physical) spatial structure. In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which can be used by designers to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
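As a very small illustration of agents interacting with environment and topology abstractions treated as first-class design elements, here is a toy sketch in code; the class names and the interaction are invented placeholders and are not the SODA meta-model or its notation.

```python
# Hedged sketch: environment abstractions (entities encapsulating functions)
# and topology abstractions (entities representing spatial structure) as
# first-class elements alongside agents.  Class names are illustrative only.
class EnvironmentAbstraction:
    """An environment entity encapsulating a function offered to agents."""
    def __init__(self, name):
        self.name = name

    def use(self, agent_name):
        return f"{agent_name} uses environment abstraction '{self.name}'"

class TopologyAbstraction:
    """A (logical or physical) place where agents and resources are located."""
    def __init__(self, name):
        self.name = name
        self.contents = []

    def add(self, entity):
        self.contents.append(entity)

class Agent:
    def __init__(self, name):
        self.name = name

    def act(self, resource):
        return resource.use(self.name)

room = TopologyAbstraction("lab-room")
sensor = EnvironmentAbstraction("temperature-sensor")
room.add(sensor)
print(Agent("monitor-agent").act(sensor))
```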
Abstract:
As distributed collaborative applications and architectures adopt policy-based management for tasks such as access control, network security and data privacy, the management and consolidation of a large number of policies is becoming a crucial component of such policy-based systems. In large-scale distributed collaborative applications like web services, there is a need to analyze policy interactions and to integrate policies. In this thesis, we propose and implement EXAM-S, a comprehensive environment for policy analysis and management, which can be used to perform a variety of functions such as policy property analysis, policy similarity analysis, policy integration, etc. As part of this environment, we have proposed and implemented new techniques for the analysis of policies that build on a deep study of state-of-the-art techniques. Moreover, we propose an approach for solving the heterogeneity problems that usually arise when analyzing policies belonging to different domains. Our work focuses on the analysis of access control policies written in the dialect of XACML (eXtensible Access Control Markup Language). We consider XACML policies because XACML is a rich language that can represent many policies of interest to real-world applications and is gaining widespread adoption in industry.
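To give a flavour of what policy similarity analysis means, here is a naive sketch that compares two simplified access-control policies by the overlap of their rules; real XACML policies are much richer and EXAM-S uses far more sophisticated analyses, so the rule representation and the Jaccard measure below are illustrative assumptions only.

```python
# Hedged sketch: a naive rule-overlap similarity between two simplified
# access-control policies.  This only illustrates the idea of policy
# similarity analysis; it is not the EXAM-S algorithm.
def rule_set(policy):
    # Represent each rule as a hashable (subject, resource, action, effect) tuple.
    return {(r["subject"], r["resource"], r["action"], r["effect"]) for r in policy}

def policy_similarity(policy_a, policy_b):
    a, b = rule_set(policy_a), rule_set(policy_b)
    if not a and not b:
        return 1.0
    # Jaccard index over the rule sets.
    return len(a & b) / len(a | b)

p1 = [{"subject": "doctor", "resource": "record", "action": "read", "effect": "Permit"},
      {"subject": "nurse", "resource": "record", "action": "read", "effect": "Permit"}]
p2 = [{"subject": "doctor", "resource": "record", "action": "read", "effect": "Permit"},
      {"subject": "nurse", "resource": "record", "action": "write", "effect": "Deny"}]
print(policy_similarity(p1, p2))  # 0.333...
```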
Abstract:
Water distribution network optimization is a challenging problem due to the dimension and complexity of these systems. Since the second half of the twentieth century this field has been investigated by many authors. Recently, to cope with the discrete nature of the variables and the non-linearity of the equations, research has focused on the development of heuristic algorithms. These algorithms do not require continuity or linearity of the problem functions because they are linked to an external hydraulic simulator that solves the equations of mass continuity and energy conservation of the network. In this work, NSGA-II (Non-dominated Sorting Genetic Algorithm II) has been used. This is a heuristic multi-objective genetic algorithm based on the analogy of evolution in nature. Starting from an initial random set of solutions, called a population, it evolves them towards a front of solutions that minimize, separately and simultaneously, all the objectives. This can be very useful in practical problems where multiple and conflicting goals are common. Usually, one of the main drawbacks of these algorithms is the computational cost: being a stochastic search, many solutions must be analyzed before good ones are found. The results of this thesis on the classical optimal design problem show that it is possible to improve the results by modifying the mathematical definition of the objective functions and the survival criterion, inserting good solutions created by a cellular automaton and using rules created by a classifier algorithm (C4.5). This part has been tested using the version of NSGA-II supplied by the Centre for Water Systems (University of Exeter, UK) in the MATLAB® environment. Even if guiding the search in this way can constrain the algorithm, with the risk of not finding the optimal set of solutions, it can greatly improve the results. Subsequently, thanks to the support of CINECA, a version of NSGA-II has been implemented in the C language and parallelized: the results on global parallelization show the speed-up obtained, while the results on island parallelization show that communication among islands can improve the optimization. Finally, some tests on the optimization of pump scheduling have been carried out. In this case, good results are found for a small network, while the solutions of a larger problem are affected by the lack of constraints on the number of pump switches. Possible future research concerns the insertion of further constraints and the guidance of the evolution. In the end, the optimization of water distribution systems is still far from a definitive solution, but improvements in this field can be very useful in reducing the cost of solutions to practical problems, where the high number of variables makes their management very difficult from a human point of view.
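Since NSGA-II is named explicitly above, here is a compact sketch of its fast non-dominated sorting step, which ranks candidate solutions into Pareto fronts; the toy objective pairs (cost versus a hydraulic deficit measure) are invented, and this is only the ranking step, not the full genetic algorithm or the Exeter implementation used in the thesis.

```python
# Hedged sketch: the non-dominated sorting step at the heart of NSGA-II,
# for a two-objective minimisation problem (e.g. network cost vs. a
# hydraulic deficit measure).  Toy data; not the thesis implementation.
def dominates(a, b):
    # a dominates b if it is no worse in every objective and better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(objectives):
    remaining = list(range(len(objectives)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objectives[j], objectives[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Toy population: (cost, hydraulic deficit) for five candidate designs.
population = [(100, 5.0), (120, 2.0), (90, 8.0), (130, 6.0), (150, 1.0)]
print(non_dominated_fronts(population))  # design 3 is dominated by design 0
```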
Abstract:
It is well known that the deposition of gaseous pollutants and aerosols plays a major role in the deterioration of monuments and built cultural heritage in European cities. Despite the many studies dedicated to the environmental damage of cultural heritage, in the case of cement mortars, commonly used in twentieth-century architecture, the deterioration due to the impact of air multi-pollutants, especially the formation of black crusts, is still not well explored, making this issue a challenging area of research. This work centers on cement mortar-environment interactions, focusing on the diagnosis of the damage to modern built heritage due to air multi-pollutants. For this purpose three sites, exposed to different urban areas in Europe, were selected for sampling and subsequent laboratory analyses: Centennial Hall, Wroclaw (Poland), Chiesa dell'Autostrada del Sole, Florence (Italy), and Casa Galleria Vichi, Florence (Italy). The sampling sessions were performed taking into account the height from ground level and the protection from rain run-off (sheltered, partly sheltered and exposed areas). The complete characterization of the collected damage layers and underlying materials was performed using a range of analytical techniques: optical and scanning electron microscopy, X-ray diffractometry, differential and gravimetric thermal analysis, ion chromatography, flash combustion/gas chromatographic analysis, and inductively coupled plasma-optical emission spectrometry. The data were elaborated using statistical methods (i.e. principal component analysis), and an enrichment factor for cement mortars was calculated for the first time. The results obtained from the experimental activity performed on the damage layers indicate that gypsum, due to the deposition of atmospheric sulphur compounds, is the main damage product at surfaces sheltered from rain run-off at the Centennial Hall and Casa Galleria Vichi. By contrast, gypsum has not been identified in the samples collected at the Chiesa dell'Autostrada del Sole. This is connected to the restoration works, particularly surface cleaning, regularly performed for the maintenance of the building. Moreover, the results demonstrate the correlation between the location of the building and the composition of the damage layer: Centennial Hall is mainly subject to the impact of pollutants emitted from the nearby coal power stations, whilst Casa Galleria Vichi is principally affected by pollutants from vehicular exhaust in front of the building.
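Since the abstract mentions the calculation of an enrichment factor, the sketch below shows the standard form of that quantity, i.e. the element-to-reference concentration ratio in the damage layer divided by the same ratio in the underlying material; the choice of calcium as reference element and the concentration values are purely illustrative placeholders, not the elements or data used in the thesis.

```python
# Hedged sketch: the standard form of an enrichment factor, applied to a
# damage-layer / underlying-mortar pair.  Reference element and numbers are
# illustrative placeholders, not the values determined in the thesis.
def enrichment_factor(c_element_layer, c_ref_layer, c_element_substrate, c_ref_substrate):
    # EF = (C_element / C_reference)_damage_layer / (C_element / C_reference)_substrate
    return (c_element_layer / c_ref_layer) / (c_element_substrate / c_ref_substrate)

# Example: sulphur strongly enriched in the black crust relative to the mortar,
# with Ca taken (hypothetically) as the reference element.
ef_sulphur = enrichment_factor(c_element_layer=25.0, c_ref_layer=300.0,
                               c_element_substrate=1.0, c_ref_substrate=320.0)
print(round(ef_sulphur, 1))  # values >> 1 suggest an external (e.g. atmospheric) source
```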
Abstract:
Biomedical analyses are becoming increasingly complex, with respect to both the type of data to be produced and the procedures to be executed. This trend is expected to continue in the future. The development of information and protocol management systems that can sustain this challenge is therefore becoming an essential enabling factor for all actors in the field. The use of custom-built solutions that require the biology domain expert to acquire or procure software engineering expertise in the development of the laboratory infrastructure is not fully satisfactory, because it incurs undesirable mutual knowledge dependencies between the two camps. We propose instead an infrastructure concept that enables the domain experts to express laboratory protocols using proper domain knowledge, free from the incidence and mediation of the software implementation artefacts. In the system that we propose, this is made possible by basing the modelling language on an authoritative domain-specific ontology and then using modern model-driven architecture technology to transform the user models into software artefacts ready for execution in a multi-agent based execution platform specialized for biomedical laboratories.
Abstract:
The objectives of this thesis are to develop new methodologies for the study of the space and time variability of the Italian upper-ocean ecosystem, through the combined use of multi-sensor satellite data and in situ observations, and to identify the capabilities and limits of remote sensing observations for monitoring the marine state at short and long time scales. Three oceanographic basins were selected and subjected to different types of analyses. The first region is the Tyrrhenian Sea, where a comparative analysis of altimetry and Lagrangian measurements was carried out to study the surface circulation. The results allowed us to deepen the knowledge of the Tyrrhenian Sea surface dynamics and its variability and to define the limitations of satellite altimetry measurements in detecting small-scale marine circulation features. The Channel of Sicily study aimed to identify the spatial-temporal variability of phytoplankton biomass and to understand the impact of the upper-ocean circulation on the marine ecosystem. A combined analysis of long-term satellite time series of chlorophyll, Sea Surface Temperature and Sea Level data was applied. The results allowed us to identify the key role of the Atlantic water inflow in modulating the seasonal variability of the phytoplankton biomass in the region. Finally, the Italian coastal marine system was studied with the objective of exploring the potential of Ocean Color data in detecting chlorophyll trends in coastal areas. The most appropriate methodology to detect long-term environmental changes was defined through an intercomparison of chlorophyll trends detected by in situ and satellite observations. Then, Italian coastal areas subject to eutrophication problems were identified. This work has demonstrated that satellite data constitute a unique opportunity to define the features and forcings influencing upper-ocean ecosystem dynamics, and can also be used to monitor environmental variables capable of influencing phytoplankton productivity.
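As a small illustration of the kind of trend estimation compared above, here is a sketch that fits a simple linear trend to a synthetic monthly chlorophyll series; the series, the seasonal cycle and the least-squares fit are illustrative assumptions, not the trend methodology actually selected in the thesis.

```python
# Hedged sketch: estimating a long-term linear trend in a chlorophyll time
# series by ordinary least squares on monthly values.  Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(120)                       # ten years of monthly data
seasonal = 0.3 * np.sin(2 * np.pi * months / 12)
trend = 0.002 * months                        # imposed trend, mg m^-3 per month
chl = 0.5 + seasonal + trend + rng.normal(0, 0.05, months.size)

# Least-squares slope (mg m^-3 per month), reported as a change per decade.
slope, intercept = np.polyfit(months, chl, 1)
print("trend per decade:", round(slope * 120, 3), "mg m^-3")
```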
Abstract:
This thesis deals with heterogeneous architectures in standard workstations. Heterogeneous architectures represent an appealing alternative to traditional supercomputers because they are based on commodity components fabricated in large quantities; hence their price-performance ratio is unparalleled in the world of high performance computing (HPC). In particular, different aspects related to the performance and power consumption of heterogeneous architectures have been explored. The thesis initially focuses on an efficient implementation of a parallel application whose execution time is dominated by a high number of floating point instructions. It then addresses the central problem of the efficient management of power peaks in heterogeneous computing systems. Finally, it discusses a memory-bound problem, where the execution time is dominated by memory latency. Specifically, the following main contributions have been carried out. First, a novel framework for the design and analysis of solar fields for Central Receiver Systems (CRS) has been developed. The implementation, based on a desktop workstation equipped with multiple Graphics Processing Units (GPUs), is motivated by the need for an accurate and fast simulation environment for studying mirror imperfections and non-planar geometries. Secondly, a power-aware scheduling algorithm for heterogeneous CPU-GPU architectures, based on an efficient distribution of the computing workload to the resources, has been realized. The scheduler manages the resources of several computing nodes with a view to reducing the peak power. This work makes two main contributions: the approach reduces the supply cost due to high peak power whilst having negligible impact on the parallelism of the computational nodes; from another point of view, the developed model allows designers to increase the number of cores without increasing the capacity of the power supply unit. Finally, an implementation for efficient graph exploration on reconfigurable architectures is presented. The purpose is to accelerate graph exploration by reducing the number of random memory accesses.
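To illustrate the general idea of distributing workload so that peak power stays bounded, here is a toy greedy sketch that assigns tasks to nodes by current estimated load; the power figures and the greedy rule are invented assumptions and do not represent the power-aware scheduling algorithm developed in the thesis.

```python
# Hedged sketch: a greedy heuristic that spreads tasks over nodes so that the
# estimated per-node peak power stays low.  Power figures are placeholders.
def schedule(tasks, nodes):
    # tasks: list of (task_name, power_draw); nodes: list of node names.
    load = {n: 0.0 for n in nodes}
    assignment = {}
    # Place the most power-hungry tasks first on the currently least-loaded node.
    for name, power in sorted(tasks, key=lambda t: t[1], reverse=True):
        target = min(load, key=load.get)
        assignment[name] = target
        load[target] += power
    peak = max(load.values())
    return assignment, peak

tasks = [("fft", 120.0), ("raytrace", 200.0), ("reduce", 60.0), ("blas", 150.0)]
assignment, peak = schedule(tasks, nodes=["node0", "node1"])
print(assignment, "estimated peak per node:", peak)
```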
Abstract:
Nowadays the rise of non-recurring engineering (NRE) costs associated with complexity is becoming a major factor in SoC design, limiting both the scaling opportunities and the flexibility advantages offered by the integration of complex computational units. The introduction of embedded programmable elements can represent an appealing solution, able both to guarantee the desired flexibility and upgradability and to widen the SoC market. In particular, embedded FPGA (eFPGA) cores can provide bit-level optimization for those applications which benefit from synthesis, paying on the other hand in terms of performance penalties and area overhead with respect to standard-cell ASIC implementations. In this scenario, this thesis proposes a design methodology for a synthesizable programmable device designed to be embedded in a SoC. A soft-core eFPGA is hence presented and analyzed in terms of the opportunities given by a fully synthesizable approach, following an implementation flow based on a standard-cell methodology. A key point of the proposed eFPGA template is that it adopts a Multi-Stage Switching Network (MSSN) as the foundation of the programmable interconnects, since it can be efficiently synthesized and optimized through a standard-cell based implementation flow, ensuring at the same time an intrinsically congestion-free network topology. The evaluation of the flexibility potential of the eFPGA has been performed using different technology libraries (STMicroelectronics CMOS 65nm and BCD9s 0.11μm) through a design space exploration in terms of area-speed-leakage tradeoffs, enabled by the full synthesizability of the template. Since the most relevant disadvantage of the adopted soft approach, compared to a hard core, is a performance overhead, the eFPGA analysis has been carried out targeting small area budgets. The generation of the configuration bitstream has been obtained thanks to the implementation of a custom CAD flow environment, and has allowed functional verification and performance evaluation through an application-aware analysis.
Abstract:
This thesis deals with the analytic study of the dynamics of multi-rotor unmanned aerial vehicles. It is conceived to give a set of mathematical instruments suited to the theoretical study and design of these flying machines. The entire work is organized in analogy with classical academic texts on airplane flight dynamics. First, the non-linear equations of motion are defined and all the external actions are modeled, with particular attention to rotor aerodynamics. All the equations are provided in a form, and with personal expedients, that makes them directly exploitable in a simulation environment. This has required answering questions such as the trim of such mathematical systems. The whole treatment is developed with the aim of describing different multi-rotor configurations. Then, the linearized equations of motion are derived, and the stability and control derivatives of the linear model are computed. The study of the static and dynamic stability characteristics is thus addressed, showing the influence of the various geometric and aerodynamic parameters of the machine, and in particular of the rotors. All the theoretical results are finally applied to two interesting cases. One concerns the design of control systems for attitude stabilization: the linear model permits the tuning of the linear controller gains and the non-linear model allows their numerical testing. The other case is the study of the performance of an innovative quad-rotor aircraft configuration. With the non-linear model, the feasibility of maneuvers impossible for a traditional quad-rotor is assessed. The linear model is applied to the controllability analysis of such an aircraft in the case of actuator blockage.
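As a small illustration of gain tuning on a linearised model followed by a numerical check, the sketch below uses a single-axis rigid-body attitude model with a proportional-derivative law; the inertia value, the gains and the one-axis model are illustrative placeholders, not the multi-rotor models or controllers derived in the thesis.

```python
# Hedged sketch: tuning PD attitude gains on a linearised single-axis model
# (I * theta_ddot = torque) and checking the closed-loop response by simple
# numerical integration.  Inertia and gains are placeholder values.
import numpy as np

I = 0.02            # pitch inertia, kg m^2 (placeholder)
kp, kd = 0.8, 0.12  # PD gains chosen so the linear closed loop is well damped

# Closed-loop linear model: theta_ddot = (-kp * theta - kd * theta_dot) / I
# Natural frequency and damping ratio implied by the gains:
wn = np.sqrt(kp / I)
zeta = kd / (2 * np.sqrt(kp * I))
print("natural frequency [rad/s]:", round(wn, 2), "damping ratio:", round(zeta, 2))

# Quick numerical check of the response from an initial 0.2 rad attitude error.
dt, theta, theta_dot = 0.001, 0.2, 0.0
for _ in range(3000):   # 3 seconds of simulated time
    torque = -kp * theta - kd * theta_dot
    theta_dot += (torque / I) * dt
    theta += theta_dot * dt
print("attitude error after 3 s [rad]:", round(theta, 4))
```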
Abstract:
The consequences of algorithmic management for workers are well known among scholars, but few studies investigate the possibilities of agency, especially at the individual level, in the gig economy. Starting from the everyday reality of the work, the objective is to analyse the forms of agency exercised by platform workers in the last-mile logistics sector. The research is based on a multi-sited ethnography conducted in two very distant countries and concerning two different urban platform services: food delivery in Italy (Bologna, Torino) and ride-hailing in Argentina (Buenos Aires). Despite the differences, the fieldwork revealed several continuities between the geographical contexts. First of all, digital technologies play an ambivalent role in the work environment: while technology is used by companies to discipline labour, it is also a tool that can be employed to the workers' advantage. In both ride-hailing and food-delivery platforms, workers express their agency by sharing workaround practices and tactics to circumvent algorithmic despotism. Secondly, the research brought to light a wide variety of economic activities developed at the margins of the platform economy. In both cases the platforms vividly intersect urban informal economies and feed informal labour circuits, as shown by the high presence of illicit exchanges: for example, the sale of accounts, hacking bots, and digital gangmastering. Far from initiating a process of formalization, therefore, the platform subsumes and reproduces the set of productive and reproductive conditions of informality (viração), offering intermittent and insecure jobs to a mass of disposable workers available for underemployment. In conclusion, the platforms are defined as baroque infrastructures, where the baroque refers both to the hybrid nature of action that mixes forms of neoliberalism-from-below with practices of peer solidarity, and to the progressive restructuring of accumulation processes marked by a renewed interdependence between the formal and the informal in the infrastructures of the «mondo a domicilio» (the "home-delivered world").
Abstract:
Machine (and deep) learning technologies are more and more present in several fields. It is undeniable that many aspects of our society are empowered by such technologies: web searches, content filtering on social networks, recommendations on e-commerce websites, mobile applications, etc., in addition to academic research. Moreover, mobile devices and internet sites, e.g., social networks, support the collection and sharing of information in real time. The pervasive deployment of the aforementioned technological instruments, both hardware and software, has led to the production of huge amounts of data. Such data have become more and more unmanageable, posing challenges to conventional computing platforms and paving the way for the development and widespread use of machine and deep learning. Nevertheless, machine learning is not only a technology. Given a task, machine learning is a way of proceeding (a way of thinking), and as such it can be approached from different perspectives (points of view). This, in particular, is the focus of this research. The entire work concentrates on machine learning, starting from different sources of data, e.g., signals and images, applied to different domains, e.g., Sport Science and Social History, and analyzed from different perspectives: from a non-data-scientist point of view through tools and platforms; setting a problem stage from scratch; implementing an effective application for classification tasks; and improving the user interface experience through Data Visualization and eXtended Reality. In essence, not only in a quantitative task, not only in a scientific environment, and not only from a data-scientist perspective, machine (and deep) learning can make the difference.
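As a minimal illustration of the kind of classification application mentioned above, here is a compact end-to-end sketch using generic scikit-learn components; the dataset and the model are placeholders, not the Sport Science or Social History data and pipelines actually analysed in the thesis.

```python
# Hedged sketch: a compact end-to-end classification task, using a generic
# built-in dataset and model as stand-ins for the thesis's own case studies.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("test accuracy:", round(accuracy_score(y_test, model.predict(X_test)), 3))
```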