862 results for computing systems design
Abstract:
In decentralised rural electrification through solar home systems, private companies and promoting institutions are faced with the problem of deploying maintenance structures to operate and guarantee the service of the solar systems for long periods (ten years or more). The problems linked to decentralisation, such as the dispersion of dwellings, difficult access and maintenance needs, make this an arduous task. This paper proposes an innovative design tool created ad hoc for photovoltaic rural electrification, based on a real photovoltaic rural electrification program in Morocco as a special case study. The tool is developed from a mathematical model comprising a set of decision variables (location, transport, etc.) that must meet certain constraints and whose optimisation criterion is the minimum cost of the operation and maintenance activity, assuming an established quality of service. The main output of the model is the overall cost of the maintenance structure. The best location for the local maintenance headquarters and warehouses in a given region is established, as are the number of maintenance technicians and vehicles required.
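The kind of optimisation this abstract describes can be sketched in miniature as a facility-location search: choose the headquarters site that minimises fixed plus transport cost for a given visit schedule. All site names, coordinates and cost figures below are illustrative assumptions, not the Moroccan case-study data.

```python
# Toy facility-location sketch: pick the maintenance headquarters that
# minimises yearly fixed cost plus round-trip transport to every village.
# Site names, coordinates, visit counts and costs are made-up examples.
SITES = {"A": (0, 0), "B": (5, 5)}              # candidate HQ locations
VILLAGES = [(1, 1, 12), (6, 4, 20), (2, 5, 8)]  # (x, y, visits per year)
COST_PER_KM = 0.5                               # transport cost per km
FIXED_COST = 100.0                              # yearly HQ fixed cost

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def yearly_cost(site):
    """Fixed cost plus round-trip transport for all maintenance visits."""
    hq = SITES[site]
    transport = sum(2 * dist(hq, (x, y)) * visits * COST_PER_KM
                    for x, y, visits in VILLAGES)
    return FIXED_COST + transport

best = min(SITES, key=yearly_cost)
```

A full model would add warehouses, technicians and vehicles as further decision variables, but the structure (enumerate candidate configurations, score each by total cost, keep the minimum) is the same.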
Abstract:
In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based, but human understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the “tyranny” of classes. They attribute a number of problems in information modeling to inherent classification – or the disregard for the fact that instances can be conceptualized independent of any class assignment. By faceting instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins both the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. 
Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotation, but are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling, through the layered information model, can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools like classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schema and the horizons of conventional classification and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve and interoperate in a networked environment, knowledge organization specialists must consider a semantic class independence like the one Parsons and Wand propose for information modeling.
Abstract:
Internet of Things systems are pervasive systems that have evolved from cyber-physical systems into large-scale systems. Due to the number of technologies involved, software development faces several integration challenges. Among them, the ones preventing proper integration are those related to system heterogeneity, and thus to interoperability issues. From a software engineering perspective, developers mostly experience the lack of interoperability in two phases of software development: programming and deployment. On the one hand, modern software tends to be distributed across several components, each adopting its most appropriate technology stack, pushing programmers to code in a protocol- and data-agnostic way. On the other hand, each software component should run in the most appropriate execution environment and, as a result, system architects strive to automate deployment in distributed infrastructures. This dissertation aims to improve the development process by introducing proper tools to handle certain aspects of system heterogeneity. Our effort focuses on three of these aspects and, for each one, we propose a tool addressing the underlying challenge. The first tool handles heterogeneity at the transport- and application-protocol level, the second manages different data formats, and the third obtains an optimal deployment. To realize the tools, we adopted a linguistic approach, i.e., we provided specific linguistic abstractions that help developers increase the expressive power of the programming language they use, writing better solutions in more straightforward ways. To validate the approach, we implemented use cases to show that the tools can be used in practice and that they help to achieve the expected level of interoperability.
In conclusion, to move a step towards the realization of an integrated Internet of Things ecosystem, we target programmers and architects and propose that they use the presented tools to ease the software development process.
Abstract:
The replacement of internal combustion engine vehicles with EVs goes by the name of electrification. The push electrification has experienced in the last decade is linked to the still ongoing evolution of power electronics technology for charging systems, which is why an evolution in testing strategies and testing equipment is crucial too. The project this dissertation is based on concerns the investigation of a new EV simulator design that optimizes the structure of the testing equipment used by the company that commissioned this work. The project requirements can be summarized in two points: reduction of space occupation and implementation of parallel charging. Some components were completely redesigned, and others were substituted with equivalent ones that could perform the same tasks. In this way it was possible to reduce the space occupation of the simulator and to increase the efficiency of the testing device. Moreover, the possibility of combining different charging simulations was investigated by launching two testing procedures in parallel on a single machine, suitably arranged to support the two charging protocols used. On the back of the results achieved in the body of this dissertation, a new design for the EV simulator was proposed, which reduced space occupation and improved space efficiency. The testing device thus proved to be far more compact, improving safety and productivity, along with a 25% cost reduction. Furthermore, parallel charging was implemented in the proposed new design, since the conducted tests clearly showed the feasibility of parallel charging sessions. The results presented in this work can thus be used to build the first prototype of the new EV simulator.
Abstract:
Power-to-Gas storage systems have the potential to address grid-stability issues that arise when an increasing share of power is generated from sources with highly variable output. Although the proof-of-concept of these systems has been promising, the behaviour of the processes in off-design conditions is not easily predictable. The primary aim of this PhD project was to evaluate the performance of an original Power-to-Gas system made up of innovative components. To achieve this, a numerical model has been developed to simulate the characteristics and behaviour of the several components when the whole system is coupled with a renewable source. The developed model has been applied to a large variety of scenarios, evaluating the performance of the considered process while exploiting a limited amount of experimental data. The model has then been used to compare different Power-to-Gas concepts in a realistic operating scenario. Several goals have been achieved. In the concept phase, the possibility of thermally integrating the high-temperature components has been demonstrated. Then, the parameters that affect the energy performance of a Power-to-Gas system coupled with a renewable source have been identified, providing general recommendations on the design of hybrid systems; these parameters are: 1) the ratio between the storage system size and the renewable generator size; 2) the type of coupled renewable source; 3) the related production profile. Finally, the results of the comparative analysis highlight that configurations with a renewable source highly oversized with respect to the storage system show the maximum achievable profit.
Abstract:
Conventional chromatographic columns are packed with porous beads by the universally employed slurry-packing method. The lack of precise control over the particle size distribution, shape and position inside the column has dramatic effects on the separation efficiency. In the first part of the thesis, an ordered, three-dimensional pillar-array structure was designed using CAD software. Several columns, characterized by different fluid distributors and bed lengths, were produced with a stereolithographic 3D printer and compared in terms of pressure drop and height equivalent to a theoretical plate (HETP). To prevent the release of unwanted substances and to provide a surface for immobilizing a ligand, the pillars were coated with one or more of the following materials: titanium dioxide, nanofibrillated cellulose (NFC) and polystyrene. The external NFC layer was functionalized with Cibacron Blue and the dynamic binding capacity of the column was measured by performing three chromatographic cycles, using bovine serum albumin (BSA) as the target molecule. The second part of the thesis deals with research activities related to the Covid-19 pandemic. In early 2020, due to the pandemic outbreak, surgical face masks became an essential non-pharmaceutical intervention to limit the spread of the virus. To address the consequent shortage and to support the reconversion of Italian industry, in late March 2020 a multidisciplinary group at the University of Bologna created the first Italian laboratory able to perform all the tests required for the evaluation and certification of surgical masks. More than 1200 tests were performed on about 350 prototypes, according to the standard EN 14683:2019. The results were analyzed to define the best material properties and mask composition for the production of masks with excellent efficiency.
To optimize the usage of surgical masks and to reduce their environmental burden, the variation of their performance over usage time was investigated to determine their maximum lifetime.
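The plate-height comparison mentioned for the 3D-printed columns rests on a textbook relation: HETP is the bed length divided by the plate count N, with N commonly estimated from a peak's retention time and its width at half height. A minimal sketch with made-up numbers, not the thesis's measurements:

```python
# HETP from the standard half-height plate-count formula:
#   N = 5.54 * (tR / w_half)^2,  HETP = L / N
# All numeric values below are illustrative examples.
def plate_count(retention_time_s, half_width_s):
    """Theoretical plate number from peak retention time and half-height width."""
    return 5.54 * (retention_time_s / half_width_s) ** 2

def hetp(bed_length_mm, retention_time_s, half_width_s):
    """Height equivalent to a theoretical plate, in mm."""
    return bed_length_mm / plate_count(retention_time_s, half_width_s)

h = hetp(50.0, 120.0, 6.0)  # 50 mm bed, tR = 120 s, w_half = 6 s
```

Lower HETP means a more efficient bed, which is why it is the natural metric for comparing columns with different distributors and bed lengths.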
Abstract:
The General Data Protection Regulation (GDPR) has been designed to help promote a view in favor of the interests of individuals instead of large corporations. However, there is a need for dedicated technologies that can help companies comply with the GDPR while enabling people to exercise their rights. We argue that such a dedicated solution must address two main issues: the need for more transparency towards individuals regarding the management of their personal information, and their often hindered ability to access and make interoperable their personal data, so that exercising one's rights becomes straightforward. We aim to provide a system that pushes personal data management towards the individual's control, i.e., a personal information management system (PIMS). By using distributed storage and decentralized computing networks in place of centralized online services, control over users' personal information can be shifted towards those directly concerned, i.e., the data subjects. The use of Distributed Ledger Technologies (DLTs) and Decentralized File Storage (DFS) as an implementation of decentralized systems is of paramount importance in this case. The structure of this dissertation follows an incremental approach, describing a set of decentralized systems and models that revolve around personal data and their subjects. Each chapter builds on the previous one and discusses the technical implementation of a system and its relation with the corresponding regulations. We refer to the EU regulatory framework, including the GDPR, eIDAS, and the Data Governance Act, to derive the functional and non-functional drivers of our final system architecture. In our PIMS design, personal data are kept in a Personal Data Space (PDS), consisting of encrypted personal data referring to the subject stored in a DFS. On top of that, a network of authorization servers acts as a data intermediary, providing access to potential data recipients through smart contracts.
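The PDS idea (encrypted personal data held in a content-addressed DFS, with the key kept by the data subject) can be illustrated with a toy sketch. The XOR keystream cipher below is deliberately simplistic and insecure; it only shows the flow (encrypt, store by hash, retrieve, decrypt), and all names and data are made up.

```python
import hashlib
import json

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream from iterated SHA-256. Illustrative only, NOT secure."""
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def encrypt(data: bytes, key: bytes) -> bytes:
    """XOR with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def store(dfs: dict, ciphertext: bytes) -> str:
    """Content-addressed put, IPFS-style: the address is the content hash."""
    cid = hashlib.sha256(ciphertext).hexdigest()
    dfs[cid] = ciphertext
    return cid

# The data subject keeps the key; the DFS only ever sees ciphertext.
dfs = {}
record = json.dumps({"name": "alice", "email": "a@example.org"}).encode()
key = b"subject-held-secret"
cid = store(dfs, encrypt(record, key))
```

In the dissertation's architecture, granting access would mean an authorization server releasing the decryption capability to a recipient via a smart contract; here the dictionary simply stands in for the DFS.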
Abstract:
Nowadays, technological advancements have brought industry and research towards the automation of various processes. Automation brings a reduction in costs and an improvement in product quality, and for this reason companies are pushing research to investigate new technologies. The agriculture industry has always looked towards automating various processes, from product processing to storage. In recent years, the automation of the harvest and cultivation phases has also become attractive, pushed by the advancement of autonomous driving. Nevertheless, ADAS systems alone are not enough: merging different technologies will be the way to obtain total automation of agricultural processes. For example, sensors that estimate products' physical and chemical properties can be used to evaluate the maturation level of fruit. The fusion of these technologies therefore has a key role in industrial process automation. In this dissertation, both ADAS systems and sensors for precision agriculture are treated. Several measurement procedures for characterizing commercial 3D LiDARs are proposed and tested to cope with the growing need for comparison tools. Axial errors and transversal errors have been investigated. Moreover, a measurement method and setup for evaluating the effect of fog on 3D LiDARs are proposed. Each presented measurement procedure has been tested, and the obtained results highlight the versatility and effectiveness of the proposed approaches. Regarding precision agriculture sensors, a measurement approach for estimating the Moisture Content and density of crops directly in the field is presented. The approach employs a Near Infrared spectrometer jointly with Partial Least Squares statistical analysis. The approach and the model are described together with a first laboratory prototype used to evaluate the NIRS approach. Finally, a prototype for in-field analysis is realized and tested.
The test results are promising, showing that the proposed approach is suitable for Moisture Content and density estimation.
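The calibration step behind such a sensor can be illustrated with the simplest possible stand-in: an ordinary least-squares fit of moisture against absorbance at a single wavelength. The actual work uses full NIR spectra with Partial Least Squares regression; the numbers below are invented for illustration.

```python
# Toy single-wavelength calibration by ordinary least squares.
# Real NIRS calibration uses full spectra + Partial Least Squares;
# the absorbance/moisture pairs here are made-up example data.
absorbance = [0.10, 0.20, 0.30, 0.40]   # NIR absorbance (a.u.)
moisture = [8.0, 11.0, 14.0, 17.0]      # reference moisture content (%)

n = len(absorbance)
mx = sum(absorbance) / n
my = sum(moisture) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(absorbance, moisture))
         / sum((x - mx) ** 2 for x in absorbance))
intercept = my - slope * mx

def predict(a):
    """Predict moisture content (%) from a new absorbance reading."""
    return intercept + slope * a
```

PLS extends this idea to many correlated wavelengths at once by projecting the spectra onto a few latent components before regressing, which is what makes it the standard tool for NIRS data.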
Abstract:
This thesis highlights the importance of ad-hoc designed and developed embedded systems in the implementation of intelligent sensor networks. As evidence, four areas of application are presented: Precision Agriculture, Bioengineering, Automotive and Structural Health Monitoring. For each field, the design and development of one or more smart devices is reported, in addition to on-board elaborations, experimental validation and in-field tests. In particular, the design and development of a fruit meter is presented. In the bioengineering field, three different projects are reported, detailing the architectures implemented and the validation tests conducted. Two prototype realizations of an inner-temperature measurement system for electric motors in an automotive application are then discussed. Lastly, the HW/SW design of a Smart Sensor Network is analyzed: the network features on-board data management and processing, integration in an IoT toolchain, Wireless Sensor Network developments and an AI framework for vibration-based structural assessment.
Abstract:
In the frame of inductive power transfer (IPT) systems, arrays of magnetically coupled resonators have received increasing attention, as they are cheap and versatile due to their simple structure. They consist of magnetically coupled coils, which resonate with their self-capacitance or with lumped capacitive networks. Of great industrial interest are planar resonator arrays used to power a receiver that can be placed at any position above the array. A thorough circuit analysis has been carried out, starting from traditional two-coil IPT devices. Then, resonator arrays have been introduced, with particular attention to the case of arrays with a receiver. To evaluate the system performance, a circuit model based on original analytical formulas has been developed and experimentally validated. The results of the analysis also led to the definition of a new doubly-fed array configuration with a receiver that can be placed above it at any position. A suitable control strategy aimed at maximising the transmitted power and the efficiency has also been proposed. The study of the array currents has been carried out resorting to the theory of magneto-inductive waves, providing useful insight. The analysis has been completed with a numerical and experimental study of the magnetic field distribution originating from the array. Furthermore, an application of the resonator array as a position sensor has been investigated: the position of the receiver is estimated through the measurement of the array input impedance, for which an original analytical expression has also been obtained. The application of this sensing technique in an automotive dynamic IPT system has been discussed. The thesis concludes with an evaluation of the possible applications of two-dimensional resonator arrays in IPT systems. These devices can be used to improve system efficiency and transmitted power, as well as for magnetic field shielding.
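For the two-coil starting point of such an analysis, a standard figure of merit from the IPT literature is that the maximum link efficiency depends only on the coupling coefficient k and the coil quality factors Q1 and Q2. A small sketch, with illustrative values rather than the thesis's measured parameters:

```python
import math

# Maximum link efficiency of a two-coil inductive link (standard IPT
# figure of merit): eta_max = x / (1 + sqrt(1 + x))^2 with x = k^2*Q1*Q2.
# The k and Q values below are illustrative examples.
def max_efficiency(k, q1, q2):
    """Best achievable link efficiency for coupling k and quality factors q1, q2."""
    x = k * k * q1 * q2
    return x / (1.0 + math.sqrt(1.0 + x)) ** 2

eta = max_efficiency(0.2, 100.0, 100.0)  # k = 0.2, Q1 = Q2 = 100
```

The formula makes the design trade-off explicit: once coil quality is fixed, efficiency is governed by coupling, which is exactly what moving the receiver across a planar array changes.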
Abstract:
The objective of the thesis project, developed within the Line Control & Software Engineering team of the G.D company, is to analyze and identify the appropriate tool to automate the HW configuration process using Beckhoff technologies by importing data from an ECAD tool. This would save a great deal of time, since the I/O topology created as part of the electrical planning is presently imported manually into the related SW project of the machine. Moreover, a manual import is more error-prone, due to human mistakes, than an automatic configuration tool. First, an introduction to TwinCAT 3, EtherCAT and the Automation Interface is provided; then, the official Beckhoff tool, XCAD Interface, is analyzed, together with the requirements it places on the electrical planning: the interface is realized by means of the AutomationML format. Finally, due to some limitations observed, a company-internal tool is designed and implemented. Tests and validation of the tool are performed on a sample production line of the company.
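AutomationML is CAEX-based XML, so reading an I/O topology from such an export essentially amounts to walking InternalElement nodes. The snippet below is a hand-made miniature to show the idea, not a real ECAD export; the terminal names are examples:

```python
import xml.etree.ElementTree as ET

# Hand-made, CAEX/AutomationML-style miniature of an I/O topology export.
# Real AutomationML files carry namespaces, RoleClass references and
# attributes; this sketch keeps only the element hierarchy.
CAEX = """
<InstanceHierarchy>
  <InternalElement Name="EK1100"/>
  <InternalElement Name="EL2004"/>
  <InternalElement Name="EL1008"/>
</InstanceHierarchy>
"""

def terminals(xml_text):
    """Collect the Name of every InternalElement, in document order."""
    root = ET.fromstring(xml_text)
    return [el.get("Name") for el in root.iter("InternalElement")]
```

An import tool would then map each name onto the corresponding device in the TwinCAT project tree instead of having an engineer re-enter the topology by hand.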
Abstract:
One of the major issues for power converters connected to the electric grid is the measurement of three-phase Conducted Emissions (CE), which are regulated by international and regional standards. CE are composed of two components: Common Mode (CM) noise and Differential Mode (DM) noise. To achieve compliance with these regulations, the Equipment Under Test (EUT) includes filtering and other electromagnetic emission control strategies. The separation of differential-mode and common-mode noise in Electromagnetic Interference (EMI) analysis is a well-known procedure, useful especially for the optimization of the EMI filter, to improve the CM or DM attenuation depending on which component of the conducted emissions is predominant, and for the analysis and understanding of interference phenomena in switched-mode power converters. However, separating the two components is rarely done during measurements. Therefore, in this thesis an active device for the separation of the CM and DM EMI noise in three-phase power electronic systems has been designed and experimentally analysed.
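The CM/DM decomposition this work builds on is, per sample, elementary: the common-mode part is the component flowing equally in all three phases (their average), and the differential-mode parts are the remainders. A minimal sketch with illustrative values:

```python
# Textbook decomposition of three-phase conducted-emission currents:
# common mode = the average flowing equally in all phases,
# differential mode = per-phase remainders (which sum to zero).
# Sample values are illustrative, not measured data.
def cm_dm(ia, ib, ic):
    """Split three phase currents into (common_mode, (dm_a, dm_b, dm_c))."""
    cm = (ia + ib + ic) / 3.0
    return cm, (ia - cm, ib - cm, ic - cm)

cm, dm = cm_dm(1.0, 2.0, 3.0)
```

The thesis's contribution is doing this separation actively in hardware at EMI frequencies, where parasitics make it far harder than this arithmetic suggests; the sketch only fixes the definitions.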
Abstract:
The use of composite resins in dentistry is well accepted for restoring anterior and posterior teeth. Many polishing protocols have been evaluated for their effect on the surface roughness of restorative materials. This study compared the effect of different polishing systems on the surface roughness of microhybrid composites. Thirty-six specimens were prepared for each composite [Charisma® (Heraeus Kulzer), Fill Magic® (Vigodent), TPH Spectrum® (Dentsply), Z100® (3M/ESPE) and Z250® (3M/ESPE)] and submitted to surface treatment with Enhance® and PoGo® (Dentsply) points, sequential Sof-Lex XT® aluminum oxide disks (3M/ESPE), and felt disks (TDV) combined with Excel® diamond polishing paste (TDV). Average surface roughness (Ra) was measured with a mechanical roughness tester. The data were analyzed by two-way ANOVA with repetition of the factorial design and the Tukey-Kramer test (p<0.01). The F-test result for treatments and resins was high (p<0.0001 for both), indicating that the effect of the treatment applied to the specimen surface and the effect of the type of resin on surface roughness were highly significant. Regarding the interaction between polishing system and type of resin used, a p value of 0.0002 was obtained, indicating a statistically significant difference. A Ra of 1.3663 was obtained for the Sof-Lex/TPH Spectrum interaction. In contrast, the Ra for the felt disk+paste/Z250 interaction was 0.1846. In conclusion, the Sof-Lex polishing system produced a higher surface roughness on TPH Spectrum resin when compared to the other interactions.
Abstract:
Background: The MASS IV-DM Trial is a large project from a single institution, the Heart Institute (InCor), University of Sao Paulo Medical School, Brazil, to study ventricular function and coronary arteries in patients with type 2 diabetes mellitus. Methods/Design: The study will enroll 600 patients with type 2 diabetes who have angiographically normal ventricular function and coronary arteries. The goal of the MASS IV-DM Trial is to achieve a long-term evaluation of the development of coronary atherosclerosis by using angiograms and coronary-artery calcium scans by electron-beam computed tomography at baseline and after 5 years of follow-up. In addition, the incidence of major cardiovascular events and the dysfunction of various organs involved in this disease, particularly microalbuminuria and renal function, will be analyzed through clinical evaluation. An effort will also be made to investigate in depth the presence of major cardiovascular risk factors, especially the biochemical profile, metabolic syndrome, inflammatory activity, oxidative stress, endothelial function, prothrombotic factors, and profibrinolytic and platelet activity. An evaluation will be made of polymorphisms as determinants of disease and their possible role in the genesis of micro- and macrovascular damage. Discussion: The MASS IV-DM trial is designed to include diabetic patients with clinically suspected myocardial ischemia in whom conventional angiography shows angiographically normal coronary arteries. The results of extensive investigation, including angiographic follow-up by several methods, vascular reactivity, pro-thrombotic mechanisms, and genetic and biochemical studies, may facilitate the understanding of the so-called micro- and macrovascular disease of DM.
Abstract:
Background: High-density tiling arrays and new sequencing technologies are generating rapidly increasing volumes of transcriptome and protein-DNA interaction data. Visualization and exploration of this data is critical to understanding the regulatory logic encoded in the genome by which the cell dynamically affects its physiology and interacts with its environment. Results: The Gaggle Genome Browser is a cross-platform desktop program for interactively visualizing high-throughput data in the context of the genome. Important features include dynamic panning and zooming, keyword search and open interoperability through the Gaggle framework. Users may bookmark locations on the genome with descriptive annotations and share these bookmarks with other users. The program handles large sets of user-generated data using an in-process database and leverages the facilities of SQL and the R environment for importing and manipulating data. A key aspect of the Gaggle Genome Browser is interoperability. By connecting to the Gaggle framework, the genome browser joins a suite of interconnected bioinformatics tools for analysis and visualization with connectivity to major public repositories of sequences, interactions and pathways. To this flexible environment for exploring and combining data, the Gaggle Genome Browser adds the ability to visualize diverse types of data in relation to its coordinates on the genome. Conclusions: Genomic coordinates function as a common key by which disparate biological data types can be related to one another. In the Gaggle Genome Browser, heterogeneous data are joined by their location on the genome to create information-rich visualizations yielding insight into genome organization, transcription and its regulation and, ultimately, a better understanding of the mechanisms that enable the cell to dynamically respond to its environment.