15 results for Variable design parameters

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

90.00%

Publisher:

Abstract:

Cloud computing enables independent end users and applications to share data and pooled resources, possibly located in geographically distributed data centers, in a fully transparent way. This need is particularly felt by scientific applications, which must exploit distributed resources in an efficient and scalable way to process large amounts of data. This paper proposes an open solution to deploy a Platform as a Service (PaaS) over a set of multi-site data centers, applying open-source virtualization tools to facilitate operation among virtual machines while optimizing the usage of distributed resources. An experimental testbed is set up in an OpenStack environment, and evaluations with different types of TCP sample connections demonstrate the functionality of the proposed solution and provide throughput measurements in relation to relevant design parameters.
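The kind of TCP throughput measurement described above can be sketched on a single host as follows (an illustrative loopback transfer, not the multi-site OpenStack setup of the paper; the port, chunk size and transfer size are arbitrary assumptions):

```python
import socket
import threading
import time

def run_sink(server_sock, received):
    """Accept one connection and count every byte received."""
    conn, _ = server_sock.accept()
    while chunk := conn.recv(65536):
        received[0] += len(chunk)
    conn.close()

# Sink side: bind to an ephemeral loopback port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
received = [0]
t = threading.Thread(target=run_sink, args=(server, received))
t.start()

# Source side: push 16 MiB through the TCP connection and time it.
payload = b"x" * 65536
client = socket.create_connection(("127.0.0.1", port))
start = time.perf_counter()
for _ in range(256):
    client.sendall(payload)
client.close()
t.join()
elapsed = time.perf_counter() - start
server.close()

throughput_mbps = received[0] * 8 / elapsed / 1e6  # megabits per second
```

On a real testbed the sink would run in a virtual machine on a remote data center, so the measured throughput would reflect the inter-site network rather than the loopback interface.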

Relevance:

80.00%

Publisher:

Abstract:

The object of this thesis was the centrifuge modelling of earth-reinforced retaining walls with modular block facing, in order to investigate the influence of design parameters, such as the length and vertical spacing of the reinforcement, on the behaviour of the structure. To this end, 11 models were tested, each with a different reinforcement length or spacing. Each model was constructed and then placed in the centrifuge, where the gravitational acceleration was artificially raised to 35 g, reproducing the soil behaviour of a 5-metre-high wall. Vertical and horizontal displacements were recorded by means of a special device which tracked deformations along the longitudinal cross-section of the structure, essentially drawing its deformed shape. As expected, the results confirmed the reinforcement parameters to be the governing factor in the behaviour of earth-reinforced structures, since increases in length and spacing improved structural stability. However, the length was found to be the leading parameter, reducing facial deformations by up to five times, while the spacing played an important role especially in unstable configurations. When failure occurred, the failure surface was characterised by the same shape (circular) and depth, regardless of the reinforcement configuration. Furthermore, the results confirmed the over-conservatism of design codes, since models with reinforcement layers 0.4H long showed almost negligible deformations. Although the experiments performed were consistent and yielded replicable results, further numerical modelling may allow the investigation of other issues, such as the influence of reinforcement stiffness, facing stiffness and varying backfills.
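The centrifuge scaling law behind the 35 g test can be made explicit: stresses in a model of height h scale as if the prototype were N times taller when spun at N g, which is how a roughly 1:35 model reproduces the 5 m wall above (the helper function is illustrative; only the 35 g and 5 m values come from the text):

```python
def prototype_height(model_height_m, n_gravities):
    """Prototype height whose stress field a model spun at n_gravities g reproduces."""
    return model_height_m * n_gravities

# A model about 0.143 m tall, spun to 35 g, represents the 5 m wall:
h = prototype_height(5.0 / 35.0, 35)  # -> 5.0 m
```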

Relevance:

80.00%

Publisher:

Abstract:

Nowadays the number of hip arthroplasty operations continues to increase because the elderly population is growing. Moreover, global life expectancy is increasing and people are adopting a more active way of life. For these reasons, implant revision operations are becoming more frequent. The procedure includes the surgical removal of the old implant and its substitution with a new one. Every time a new implant is inserted, it alters the internal strain distribution of the femur, jeopardizing the remodeling process and possibly leading to bone tissue loss. This is of major concern, particularly in the proximal Gruen zones, which are considered critical for implant stability and longevity. Today, different implant designs exist on the market; however, there is no clear understanding of which implant design parameters best achieve optimal mechanical conditions. The aim of this study is to investigate the stress-shielding effect generated by different implant design parameters on the proximal femur, evaluating which ranges of those parameters lead to the most physiological conditions.

Relevance:

40.00%

Publisher:

Abstract:

In this thesis, a pressure regulation system for space propulsion engines (electric and cold gas) has been designed. The Bang-Bang Control (BBC) method has been implemented through the open/close command of a solenoid valve, while the mass flow rate of the propellant is fixed with suitable flow restrictors. First, a comparison between mechanical and electronic (BBC) pressure regulators was carried out, which yielded enough advantages to select the electronic type. The major advantage is the possibility of obtaining a variable outlet pressure, under a variable inlet pressure, through a simple remote command, whereas in mechanical pressure regulators the ratio between inlet and outlet pressures must be set mechanically. Different pressure control schemes were analyzed, varying the number of solenoid valves, flow restrictors and plenums. For each scheme, the valve switching frequencies were evaluated both with simplified mathematical models and with simulators implemented in Python; the results obtained from the two methods matched quite well. For all the schemes it was possible to observe how the frequency and duty cycle varied with the different parameters. These results, after experimental checks, can be used to design the control system for a given total number of cycles that a specific solenoid valve can guarantee. Finally, tests were performed which verified the effectiveness of the control system and also suggested some guidelines for optimizing the simulator.
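The bang-bang scheme described above can be sketched in a few lines of Python: a plenum fills while the solenoid valve is open and empties through the flow restrictor, and the valve switches on a hysteresis band around the setpoint. All names and parameter values below are illustrative assumptions, not the thesis simulator:

```python
def simulate_bbc(p0, setpoint, band, inflow, outflow, dt, steps):
    """Simulate a plenum regulated by an on/off solenoid valve.

    p0       : initial plenum pressure [bar]
    setpoint : target pressure [bar]
    band     : half-width of the hysteresis band [bar]
    inflow   : pressure rise rate with the valve open [bar/s]
    outflow  : pressure drop rate through the flow restrictor [bar/s]
    """
    p, valve_open, openings = p0, False, 0
    history = []
    for _ in range(steps):
        if not valve_open and p < setpoint - band:
            valve_open = True
            openings += 1                      # count open/close cycles
        elif valve_open and p > setpoint + band:
            valve_open = False
        p += (inflow if valve_open else 0.0) * dt - outflow * dt
        history.append(p)
    return history, openings

history, openings = simulate_bbc(p0=1.0, setpoint=2.0, band=0.05,
                                 inflow=1.0, outflow=0.2, dt=0.01, steps=2000)
```

From `openings` and the simulated time span one can estimate the switching frequency, which is exactly the quantity the thesis uses to size the control system against the valve's guaranteed cycle count.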

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this work is to define and calculate a factor of collapse related to the traditional method used to design sheet pile walls, and to identify the parameters that most influence a finite element model of this problem. The text is structured as follows: chapters 1 to 5 analyze a series of topics useful for understanding the problem, while the considerations most directly related to the purpose of the work are reported in chapters 6 to 10. The first part of the document covers: what a sheet pile wall is, which codes govern the design of these structures and what they prescribe, how a mathematical model of the soil can be formulated, some fundamentals of finite element analysis, and finally the traditional methods that support the design of sheet pile walls. In chapter 6 a parametric analysis was performed, answering the second part of the purpose of the work. Comparing the results of a laboratory test on a cantilever sheet pile wall in sandy soil with those provided by a finite element model of the same problem, we concluded that: in modelling a sandy soil, attention should be paid to the value of cohesion inserted in the model (some programs, like Abaqus, do not accept a null value for this parameter); the friction angle and elastic modulus of the soil significantly influence the behaviour of the structure-soil system; other parameters, like the dilatancy angle or Poisson's ratio, do not seem to influence it. The logical path followed in the second part of the text is reported here. Two different structures were analyzed: the first supports an excavation of 4 m, the second an excavation of 7 m.
Both structures are first designed using the traditional method, then implemented in a finite element program (Abaqus) and pushed to collapse by decreasing the friction angle of the soil. The factor of collapse is the ratio between the tangents of the initial friction angle and of the friction angle at collapse. Finally, a more detailed analysis of the first structure was performed, showing that the value of the factor of collapse is influenced by a wide range of parameters, including the values of the coefficients assumed in the traditional method and the relative stiffness of the structure-soil system. In the majority of cases, the factor of collapse was found to lie between 1.25 and 2. With some considerations, reported in the text, these values can be compared with the safety factor proposed by the code (linked to the friction angle of the soil).
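The factor of collapse defined above is a one-line computation; the sketch below follows the text's definition exactly, while the numeric angles are hypothetical values chosen only for illustration:

```python
import math

def collapse_factor(phi_initial_deg, phi_collapse_deg):
    """Factor of collapse: tan(initial friction angle) / tan(friction angle at collapse)."""
    return (math.tan(math.radians(phi_initial_deg))
            / math.tan(math.radians(phi_collapse_deg)))

# Hypothetical example: soil starts at 35 degrees and collapses at 25 degrees.
f = collapse_factor(35.0, 25.0)  # ~ 1.50, inside the 1.25-2 range found in the text
```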

Relevance:

30.00%

Publisher:

Abstract:

In the last decade, the demand for structural health monitoring expertise has increased exponentially in the United States. The aging issues that most transportation structures are experiencing can put in serious jeopardy the economic system of a region as well as of a country. At the same time, the monitoring of structures is a central topic of discussion in Europe, where the preservation of historical buildings has been addressed over the last four centuries. More recently, various concerns have arisen about the security performance of civil structures after tragic events such as 9/11 or the 2011 Japan earthquake: engineers look for designs able to resist exceptional loadings due to earthquakes, hurricanes and terrorist attacks. After events of this kind, the assessment of the remaining life of the structure is at least as important as the initial performance design. Consequently, it appears very clear that the introduction of reliable and accessible damage assessment techniques is crucial for the localization of issues and for a correct and immediate rehabilitation. System Identification is a branch of the more general Control Theory. In Civil Engineering, this field addresses the techniques needed to find mechanical characteristics, such as stiffness or mass, starting from the signals captured by sensors. The objective of Dynamic Structural Identification (DSI) is to define, starting from experimental measurements, the fundamental modal parameters of a generic structure in order to characterize its dynamic behavior via a mathematical model. The knowledge of these parameters is helpful in the Model Updating procedure, which permits the definition of corrected theoretical models through experimental validation. The main aim of this technique is to minimize the differences between the theoretical model results and in-situ measurements of dynamic data.
Therefore, the updated model becomes a very effective control practice when it comes to rehabilitation of structures or damage assessment. Instrumenting a whole structure is sometimes unfeasible, because of the high cost involved or because it is not physically possible to reach each point of the structure. Numerous scholars have therefore been trying to address this problem, and two main approaches are involved. Given the limited number of sensors, in the first case it is possible to gather time histories only for some locations, then move the instruments to another location and repeat the procedure. Otherwise, if the number of sensors is sufficient and the structure does not present a complicated geometry, it is usually enough to detect only the first principal modes. These two problems are well presented in the works of Balsamo [1], for the application to a simple system, and Jun [2], for the analysis of a system with a limited number of sensors. Once the system identification has been carried out, it is possible to access the actual system characteristics. A frequent practice is to create an updated FEM model and assess whether or not the structure fulfills the requested functions. The objective of this work is to present a general methodology to analyze large structures using a limited amount of instrumentation while, at the same time, obtaining the most information about the identified structure without recalling methodologies of difficult interpretation. A general framework of the state-space identification procedure via the OKID/ERA algorithm is developed and implemented in Matlab. Then, some simple examples are proposed to highlight the principal characteristics and advantages of this methodology. A new algebraic manipulation for a prolific use of substructuring results is developed and implemented.
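The core of the ERA step can be sketched compactly: a block Hankel matrix of Markov parameters is factored by SVD to recover a minimal state-space realization. The thesis implements OKID/ERA in Matlab; the NumPy version below is an illustrative single-input single-output analogue, verified against a known first-order system, not the author's code:

```python
import numpy as np

def era(markov, n, rows=10, cols=10):
    """Identify (A, B, C) of order n from Markov parameters markov[k] = C A^k B."""
    # Shifted Hankel matrices built from the impulse-response sequence.
    H0 = np.array([[markov[i + j] for j in range(cols)] for i in range(rows)])
    H1 = np.array([[markov[i + j + 1] for j in range(cols)] for i in range(rows)])
    U, s, Vt = np.linalg.svd(H0)
    Un, sn, Vtn = U[:, :n], s[:n], Vt[:n, :]        # rank-n truncation
    s_inv_sqrt = np.diag(1.0 / np.sqrt(sn))
    s_sqrt = np.diag(np.sqrt(sn))
    A = s_inv_sqrt @ Un.T @ H1 @ Vtn.T @ s_inv_sqrt
    B = s_sqrt @ Vtn[:, :1]                          # single input
    C = Un[:1, :] @ s_sqrt                           # single output
    return A, B, C

# Markov parameters of a known system x[k+1] = 0.9 x[k] + u[k], y = x:
true_markov = [0.9 ** k for k in range(25)]
A, B, C = era(true_markov, n=1)
# The identified A recovers the true pole 0.9, and C @ B the first Markov parameter.
```

In the full OKID/ERA pipeline, the Markov parameters themselves are first estimated from measured input-output data before this realization step is applied.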

Relevance:

30.00%

Publisher:

Abstract:

Data Distribution Management (DDM) is a core part of the High Level Architecture (HLA) standard: its goal is to optimize the resources used by simulation environments to exchange data. It has to filter and match the information generated during a simulation so that each federate (that is, a simulation entity) only receives the information it needs. It is important that this is done quickly and accurately, in order to achieve better performance and avoid the transmission of irrelevant data, which would otherwise quickly saturate network resources. The main topic of this thesis is the implementation of a super partes DDM testbed that evaluates the quality of DDM approaches of all kinds: it supports both region-based and grid-based approaches, and may also support other, still unknown methods. It uses three factors to rank them: execution time, memory usage and distance from the optimal solution. A prearranged set of instances is already available, but we also allow the creation of instances with user-provided parameters. The thesis is structured as follows. We start by introducing what DDM and HLA are and what they do in detail. Then, in the first chapter, we describe the state of the art, providing an overview of the best-known resolution approaches and the pseudocode of the most interesting ones. The third chapter describes how the testbed we implemented is structured. In the fourth chapter we present and compare the results we obtained from the execution of the four approaches we have implemented. The result of the work described in this thesis can be downloaded from SourceForge at the following link: https://sourceforge.net/projects/ddmtestbed/. It is licensed under the GNU General Public License version 3.0 (GPLv3).
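The matching problem at the heart of region-based DDM reduces to interval intersection: an update region and a subscription region match when their extents overlap in every dimension. A minimal one-dimensional brute-force sketch (names are assumptions, not the testbed's code) looks like this:

```python
def overlaps(a, b):
    """a, b: (lower, upper) bounds of one dimension of a region.
    Two extents overlap when neither ends before the other begins."""
    return a[0] <= b[1] and b[0] <= a[1]

def match(update_regions, subscription_regions):
    """Return index pairs (update, subscription) whose extents intersect."""
    return [(i, j)
            for i, u in enumerate(update_regions)
            for j, s in enumerate(subscription_regions)
            if overlaps(u, s)]

pairs = match([(0, 10), (20, 30)], [(5, 15), (40, 50)])  # -> [(0, 0)]
```

This brute-force approach is quadratic in the number of regions; the approaches surveyed in the thesis (grid-based, sort-based, and so on) exist precisely to beat this baseline.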

Relevance:

30.00%

Publisher:

Abstract:

Every year, thousands of surgical procedures are performed to repair or, where possible, completely replace organs or tissues affected by degenerative diseases. Patients with these kinds of illnesses wait a long time for a donor who could replace, in a short time, the damaged organ or tissue. The lack of biological alternatives to conventional surgical treatments such as autografts, allografts and xenografts has led researchers from different areas to collaborate in finding innovative solutions. This research gave rise to a new discipline able to merge molecular biology, biomaterials, engineering, biomechanics and, recently, design and architecture knowledge. This discipline is named Tissue Engineering (TE), and it represents a step towards substitutive or regenerative medicine. One of the major challenges of TE is to design and develop, using a biomimetic approach, an artificial 3D anatomical scaffold suitable for the adhesion of cells that are able to proliferate and differentiate in response to the biological and biophysical stimuli of the specific tissue to be replaced. Nowadays, powerful instruments allow increasingly accurate and well-defined analyses of patients who need precise diagnoses and treatments. Starting from patient-specific information provided by CT (Computed Tomography), micro-CT and MRI (Magnetic Resonance Imaging), an image-based approach can be used to reconstruct the site to be replaced. With the aid of recent Additive Manufacturing techniques, which allow tridimensional objects to be printed with sub-millimetric precision, it is now possible to exert almost complete control over the parametric characteristics of the scaffold: this is the way to achieve correct cellular regeneration. In this work, we focus on a branch of TE known as Bone TE, whose main subject is bone.
Bone TE combines the osteoconductive and morphological aspects of the scaffold, whose main properties are pore diameter, structural porosity and interconnectivity. The realization of the ideal values of these parameters is the main goal of this work: here we create a simple and interactive biomimetic design process, based on 3D CAD modeling and generative algorithms, that provides a way to control the main properties and to create a structure morphologically similar to cancellous bone. Two different typologies of scaffold are compared: the first is based on Triply Periodic Minimal Surfaces (T.P.M.S.), whose basic crystalline geometries are nowadays used for Bone TE scaffolding; the second is based on Voronoi diagrams, more often used in the design of decorations and jewellery for their capacity to decompose and tessellate a volumetric space with a heterogeneous spatial distribution (frequent in nature). We show how to manipulate the main properties (pore diameter, structural porosity and interconnectivity) of the TE-oriented scaffold design through the implementation of generative algorithms: "bringing back the nature to the nature".
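The link between a generative parameter and scaffold porosity can be illustrated on the gyroid, one of the classic T.P.M.S. geometries: the unit cell is solid where the implicit function is near zero, and the wall-thickness threshold t directly controls porosity. The sketch below estimates porosity by voxel sampling; t and the sampling resolution are hypothetical illustration values, not design values from the thesis:

```python
import math

def gyroid(x, y, z):
    """Implicit gyroid function; the scaffold wall is the region |g| < t."""
    return (math.sin(x) * math.cos(y)
            + math.sin(y) * math.cos(z)
            + math.sin(z) * math.cos(x))

def porosity(t, n=40):
    """Fraction of a 2*pi unit cell NOT occupied by the solid wall |g| < t."""
    solid = 0
    step = 2 * math.pi / n
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if abs(gyroid(i * step, j * step, k * step)) < t:
                    solid += 1
    return 1.0 - solid / n ** 3

p_thin = porosity(0.1)   # thin walls: high porosity
p_thick = porosity(0.3)  # thicker walls lower the porosity
```

A generative-design workflow of the kind described above sweeps such a parameter until the estimated porosity and pore size match those of cancellous bone.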

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, environmental issues and climate change play fundamental roles in the design of urban spaces. Our cities are growing in size, often only following immediate needs without a long-term vision. Consequently, sustainable development has become not only an ethical but also a strategic need: we can no longer afford uncontrolled urban expansion. One serious effect of the industrialisation of the territory is the increase of urban air and surface temperatures compared to the outlying rural surroundings. This difference in temperature is what constitutes an urban heat island (UHI). The purpose of this study is to clarify the role of urban surfacing materials in the thermal dynamics of an urban space, resulting in useful indications and advice for mitigating UHI. With this aim, 4 coloured concrete bricks were tested, measuring their emissivity and building up their heat-release curves using infrared thermography. Two emissivity evaluation procedures were carried out and subsequently compared. The performance of the samples was assessed, and the influence of colour on the thermal behaviour was investigated. In addition, some external pavements were analysed: their albedo and emissivity were evaluated in order to understand their thermal behaviour in different conditions, and surface temperatures were recorded in a one-day measurement campaign. The ENVI-met software was used to simulate how the tested materials would behave in two typical urban scenarios, an urban canyon and an urban heat basin, and the improvements they can bring to the urban microclimate were investigated. The emissivities obtained for the bricks ranged between 0.92 and 0.97, suggesting a limited influence of colour on this parameter. Nonetheless, the white concrete brick showed the best thermal performance, whilst the black one showed the worst; the red and yellow ones showed nearly identical intermediate trends. De facto, colours affected the overall thermal behaviour.
The emissivity of the outdoor pavements was also measured, yielding, as expected, high values for the asphalts. Albedo measurements, conducted with a sunshine pyranometer, proved the improving effect of the yellow paint in terms of solar reflection, and the negative influence of haze on measurement accuracy. The ENVI-met simulations demonstrated the thermal-improvement effectiveness of some of the tested materials. In particular, the results showed good performance for white bricks and granite in the heat-basin scenario, and for painted concrete and macadam in the urban-canyon scenario. These materials can be considered valuable solutions for UHI mitigation.
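Why the measured emissivity range (0.92-0.97) matters for heat release follows from the Stefan-Boltzmann law, q = eps * sigma * T^4. A quick sketch (the surface temperatures are hypothetical sun-heated values, not measurements from the campaign):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant [W m^-2 K^-4]

def radiated_flux(emissivity, temp_c):
    """Thermal flux [W/m^2] emitted by a surface at temp_c degrees Celsius."""
    t_k = temp_c + 273.15
    return emissivity * SIGMA * t_k ** 4

# Hypothetical afternoon temperatures: a dark brick heats more than a white one.
q_black = radiated_flux(0.97, 55.0)
q_white = radiated_flux(0.92, 40.0)
```

The fourth-power dependence on temperature means the lower surface temperature reached by high-albedo materials dominates the small emissivity differences, consistent with the white brick's best overall performance.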

Relevance:

30.00%

Publisher:

Abstract:

Recent studies found that soil-atmosphere coupling, through soil moisture, is crucial to simulate heat wave amplitude, duration and intensity well. Moreover, it was found that soil moisture depletion in both winter and spring anticipates strong heat waves during the summer. In geophysical studies, irrigation can be regarded as an anthropogenic forcing on soil moisture, alongside changes in land properties. In this study, irrigation was added to a hydrostatic limited-area model (BOLAM) coupled with the soil, and the response of the model to the irrigation perturbation was analyzed during a dry summer season. To identify a dry summer with overall positive temperature anomalies, an extensive climatological characterization of 2015 was carried out; the method included a statistical validation of the reference-period distribution used to calculate the anomalies. Drought conditions were observed during summer 2015 and the previous seasons, both in the analyzed region and on the Alps; moreover, July was characterized as an extreme event with respect to the reference distribution. The numerical simulation covered the summer season of 2015 and consisted of two runs: a control run (CTR), with soil coupling, and a perturbed run (IPR). The perturbation consists of a land-use mask created from the FAO Cropland dataset, where an irrigation water flux of 3 mm/day was applied from 6 A.M. to 9 A.M. every day. The results show that the differences between CTR and IPR have a strong daily cycle. The main modifications concern the properties of the air masses, not the dynamics. However, changes in the circulation at the boundaries of the Po Valley are observed, and a diagnostic spatial correlation of the variable differences shows that the soil moisture perturbation explains well the variations observed in the 2-metre temperature and in the latent heat fluxes. On the other hand, it does not explain the spatial shift up- and downslope observed during different periods of the day.
Given these results, the irrigation process affects the atmospheric properties on a scale larger than the irrigated area; it is therefore important in daily forecasting, particularly during hot and dry periods.
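Since the 3 mm/day irrigation flux is concentrated in the 6-9 A.M. window, the instantaneous water flux the model receives during that window follows from a unit conversion (1 mm of water over 1 m^2 weighs 1 kg); the values below come from the text, the conversion itself is just arithmetic:

```python
MM_PER_DAY = 3.0    # prescribed irrigation depth [mm/day]
WINDOW_H = 3.0      # daily application window, 6 to 9 A.M. [hours]
RHO_WATER = 1000.0  # water density [kg/m^3]

# 3 mm/day = 3 kg/m^2/day of water, delivered entirely within a 3-hour window:
flux_kg_m2_s = (MM_PER_DAY / 1000.0) * RHO_WATER / (WINDOW_H * 3600.0)
# ~ 2.78e-4 kg m^-2 s^-1 while the irrigation is on
```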

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this thesis is to analyse the spatial and temporal variability of the aragonite saturation state (ΩAR), commonly used as an indicator of ocean acidification, in the North-East Atlantic. When the aragonite saturation state decreases below a certain threshold, ΩAR < 1, calcifying organisms (i.e. molluscs, pteropods, foraminifera, crabs, etc.) are subject to dissolution of their shells and aragonite structures. This objective agrees with the challenge 'Ocean, climate change and acidification' of the EU COST Ocean Governance for Sustainability project, which aims to combine the information collected on the state of health of the oceans. Two open-source data products, EMODnet and GLODAPv2, have been integrated and analysed for the first time in the North-East Atlantic region. The integrated dataset contains 1038 ΩAR vertical profiles whose time distribution spans from 1970 to 2014. ΩAR has been computed with the CO2SYS software, considering different combinations of the input parameters pH, Total Alkalinity (TAlk) and Dissolved Inorganic Carbon (DIC), associated with temperature, salinity and pressure at in-situ conditions. A sensitivity analysis has been performed to better understand the consistency of ΩAR computed from the different combinations of pH, TAlk and DIC, and to verify the difference between the observed TAlk and DIC parameters and their output values from the CO2SYS tool. Maps of ΩAR have been computed with the best data coverage obtained from the two datasets, at different depth levels in the area of investigation, and compared with the work of Jiang et al. (2015). The results are consistent and show similar horizontal and vertical patterns. The study highlights some aragonite-undersaturated values (ΩAR < 1) below 500 metres depth, suggesting a potential effect of acidification in the considered time period.
This thesis aims to be a preliminary work for future studies that will be able to characterize the ΩAR variability on decadal scales, based on the extended time series acquired in this work.
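The saturation state itself is defined as Omega = [Ca2+][CO3 2-] / Ksp(aragonite); the thesis obtains the carbonate-ion concentration from the full CO2SYS carbonate-system solver, but the final step can be sketched directly (the concentrations and solubility product below are hypothetical order-of-magnitude surface-ocean values, not values from the integrated dataset):

```python
def omega_aragonite(ca_mol_kg, co3_mol_kg, ksp):
    """Saturation state: ion concentration product over the solubility product."""
    return ca_mol_kg * co3_mol_kg / ksp

# Assumed illustrative inputs:
omega = omega_aragonite(ca_mol_kg=0.0103,   # ~10.3 mmol/kg calcium
                        co3_mol_kg=200e-6,  # ~200 umol/kg carbonate ion
                        ksp=6.48e-7)        # stoichiometric Ksp for aragonite
# omega > 1: supersaturated; omega < 1: shells tend to dissolve
```

At depth, pressure raises Ksp while the carbonate-ion concentration drops, which is why the undersaturated values reported above appear below 500 m.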

Relevance:

30.00%

Publisher:

Abstract:

The dissertation starts by describing the phenomena related to the recently increasing importance of satellite applications. The spread of such technology comes with implications, such as an increase in maintenance cost, from which derives the interest in developing advanced techniques that favor an augmented autonomy of spacecraft in health monitoring. Machine learning techniques are widely employed to lay a foundation for effective systems specialized in fault detection by examining telemetry data. Telemetry consists of a considerable amount of information; therefore, the adopted algorithms must be able to handle multivariate data while facing the limitations imposed by the on-board hardware. In the framework of outlier detection, the dissertation addresses unsupervised machine learning methods, in which no prior knowledge of the data behavior is assumed. Specifically, two models are considered, namely Local Outlier Factor and One-Class Support Vector Machines. Their performances are compared in terms of both the achieved prediction accuracy and the equivalent computational cost. Both models are trained and tested on the same sets of time series data in a variety of settings, aimed at gaining insight into the effect of increasing dimensionality. The results obtained allow us to claim that both models, combined with a proper tuning of their characteristic parameters, successfully fulfill the role of outlier detectors in multivariate time series data. Nevertheless, in this specific context, Local Outlier Factor outperforms One-Class SVM, in that it proves to be more stable over a wider range of input parameter values. This property is especially valuable in unsupervised learning, since it suggests that the model is well suited to adapting to unforeseen patterns.
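The two detectors compared above are both available in scikit-learn and share the same `fit_predict` interface. A minimal comparison on synthetic multivariate data (illustrative only: the thesis uses telemetry time series, and the hyperparameter values here are assumptions) might look like this:

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
nominal = rng.normal(0.0, 1.0, size=(200, 3))   # nominal 3-channel "telemetry"
outliers = rng.uniform(6.0, 8.0, size=(5, 3))   # injected anomalies, far away
X = np.vstack([nominal, outliers])

# Local Outlier Factor: density-based, -1 marks outliers.
lof = LocalOutlierFactor(n_neighbors=20, contamination=0.025)
lof_pred = lof.fit_predict(X)

# One-Class SVM: boundary-based, -1 marks outliers; nu bounds the outlier fraction.
ocsvm = OneClassSVM(nu=0.025, gamma="scale")
ocsvm_pred = ocsvm.fit_predict(X)

# How many of the 5 injected anomalies each detector catches:
lof_hits = int((lof_pred[-5:] == -1).sum())
ocsvm_hits = int((ocsvm_pred[-5:] == -1).sum())
```

Repeating such runs while sweeping `n_neighbors` and `nu` is one way to probe the parameter-stability difference the dissertation reports.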

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes a study conducted for the development of a new approach to the design of compliant mechanisms (CMs). Currently, compliant mechanisms are based on a 2.5D design method, which limits the range of applications for which they can be used. The proposed research suggests using a 3D approach for the design of CMs, to better exploit their useful properties. To test the viability of this method, a practical application related to morphing wings was chosen. During this project, a working prototype of a variable-sweep and variable-AoA system was designed and built for an SUAV. A compliant hinge, designed using the proposed 3D design approach, allows the system to achieve two DOF. To validate the capabilities of the design, two methods were used. The first was simulation: using analysis software, a basic picture of the stress and deformation of the designed mechanism could be obtained. The second validation was done by means of Additive Manufacturing: using FDM and material-jetting technologies, several prototypes were manufactured. The first model showed that the DOF could be achieved. Models manufactured with material-jetting technology proved that the designed mechanism could provide the desired motion and exploit the positive characteristics of CMs. The system could be manufactured successfully in one part, which makes an extensive assembly process redundant and improves its structural quality. The materials chosen for the prototypes were PLA, VeroGray and Rigur. The material properties were suboptimal for the final purpose, but successful results were obtained: the prototypes proved tough and were able to provide the desired motion. This proves that the proposed design method can be a useful tool for the design of improved CMs. Furthermore, the variable-sweep and AoA system could be used to boost the flight performance of SUAVs.

Relevance:

30.00%

Publisher:

Abstract:

This thesis project studies the agent identity privacy problem in the scalar linear quadratic Gaussian (LQG) control system. For the agent identity privacy problem in LQG control, privacy models and privacy measures have to be established first; moreover, the problem depends on a trajectory of correlated data rather than a single observation. I propose privacy models and the corresponding privacy measures that take these two characteristics into account. The agent identity is a binary hypothesis: Agent A or Agent B. An eavesdropper is assumed to run a hypothesis test on the agent identity based on the intercepted environment state sequence. The privacy risk is measured by the Kullback-Leibler divergence between the probability distributions of the state sequences under the two hypotheses. Taking into account both the accumulated control reward and the privacy risk, an optimization problem for the policy of Agent B is formulated. The optimal deterministic privacy-preserving LQG policy of Agent B is a linear mapping, and a sufficient condition is given to guarantee that this policy is time-invariant in the asymptotic regime. An independent Gaussian random variable cannot improve the performance of Agent B. Numerical experiments justify the theoretical results and illustrate the reward-privacy trade-off. Based on the privacy model and the LQG control model, I have formulated the mathematical problems for the agent identity privacy problem in LQG, addressing the two design objectives: maximizing the control reward and minimizing the privacy risk. I have conducted a theoretical analysis of the LQG control policy in the agent identity privacy problem and of the trade-off between the control reward and the privacy risk. Finally, the theoretical results are justified by numerical experiments, from which some interesting observations and insights are drawn and explained in the last chapter.
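The privacy risk above is a Kullback-Leibler divergence between Gaussian distributions. For the scalar single-observation case the divergence has a closed form, sketched below (illustrative: the thesis works with the distributions of entire state trajectories, not a single sample):

```python
import math

def kl_gaussian(mu0, var0, mu1, var1):
    """KL( N(mu0, var0) || N(mu1, var1) ) for scalar Gaussians."""
    return 0.5 * (var0 / var1 + (mu1 - mu0) ** 2 / var1
                  - 1.0 + math.log(var1 / var0))

# Identical distributions give zero divergence: the eavesdropper learns nothing.
zero = kl_gaussian(0.0, 1.0, 0.0, 1.0)

# A mean shift between the two hypotheses raises the risk (easier hypothesis test).
risk = kl_gaussian(0.0, 1.0, 1.0, 1.0)  # = 0.5
```

The policy of Agent B trades control reward against keeping this divergence small, which is exactly the reward-privacy trade-off illustrated by the numerical experiments.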

Relevance:

30.00%

Publisher:

Abstract:

In collaboration with G.D. SpA, I attended an internship with the purpose of developing a filter for the position control of industrial machines during testing and maintenance operations. The filter processes a position signal provided by an electronic handwheel, enabling the application to be controlled with a velocity signal whose dynamics are arbitrarily chosen during the design phase. Limiting the dynamics of the filter provides a more stable and less demanding reference trajectory, which reduces the vibrations and tracking errors of the motor it controls. It also prevents misuse of the handwheel by the technician, which could otherwise result in harmful interference between the mechanical parts moved by the handwheel.
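One way such a position-to-velocity filter with limited dynamics can be sketched: differentiate the handwheel position, clamp the raw velocity to protect against misuse, and smooth it with a first-order low-pass whose time constant is fixed at design time. The structure, names and constants below are assumptions for illustration, not G.D. SpA's implementation:

```python
class HandwheelFilter:
    def __init__(self, dt, tau, v_max):
        self.dt = dt                      # sampling period [s]
        self.alpha = dt / (tau + dt)      # first-order low-pass coefficient
        self.v_max = v_max                # velocity saturation [units/s]
        self.prev_pos = 0.0
        self.v = 0.0

    def step(self, pos):
        raw_v = (pos - self.prev_pos) / self.dt            # finite difference
        self.prev_pos = pos
        raw_v = max(-self.v_max, min(self.v_max, raw_v))   # clamp abrupt misuse
        self.v += self.alpha * (raw_v - self.v)            # smooth the reference
        return self.v

# A steady handwheel turn of 1 unit/s converges to a 1 unit/s velocity reference:
f = HandwheelFilter(dt=0.001, tau=0.05, v_max=10.0)
out = [f.step(0.001 * k) for k in range(1, 500)]
```

The time constant `tau` is the design knob: a larger value gives a gentler, less demanding reference for the motor at the cost of a slower response to the handwheel.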