612 results for Paramount


Relevance:

10.00%

Publisher:

Abstract:

The lattice Boltzmann method is a popular approach for simulating hydrodynamic interactions in soft matter and complex fluids. The solvent is represented on a discrete lattice whose nodes are populated by particle distributions that propagate on the discrete links between the nodes and undergo local collisions. On large length and time scales, the microdynamics leads to a hydrodynamic flow field that satisfies the Navier-Stokes equation. In this thesis, several extensions to the lattice Boltzmann method are developed. In complex fluids, such as suspensions, Brownian motion of the solutes is of paramount importance. However, it cannot be simulated with the original lattice Boltzmann method because the dynamics is completely deterministic. It is possible, though, to introduce thermal fluctuations in order to reproduce the equations of fluctuating hydrodynamics. In this work, a generalized lattice gas model is used to systematically derive the fluctuating lattice Boltzmann equation from the principles of statistical mechanics. The stochastic part of the dynamics is interpreted as a Monte Carlo process, which is then required to satisfy the condition of detailed balance. This leads to an expression for the thermal fluctuations which implies that it is essential to thermalize all degrees of freedom of the system, including the kinetic modes. The new formalism guarantees that the fluctuating lattice Boltzmann equation is simultaneously consistent with both fluctuating hydrodynamics and statistical mechanics. This establishes a foundation for future extensions, such as the treatment of multi-phase and thermal flows.

An important range of applications for the lattice Boltzmann method is formed by microfluidics. Fostered by the "lab-on-a-chip" paradigm, there is an increasing need for computer simulations able to complement the achievements of theory and experiment. Microfluidic systems are characterized by a large surface-to-volume ratio, so boundary conditions are of special relevance. On the microscale, the standard no-slip boundary condition of hydrodynamics has to be replaced by a slip boundary condition. In this work, a boundary condition for the lattice Boltzmann method is constructed that allows the slip length to be tuned by a single model parameter. Furthermore, a conceptually new approach to constructing boundary conditions is explored, in which the reduced symmetry at the boundary is explicitly incorporated into the lattice model, and the lattice Boltzmann method is systematically extended to this reduced-symmetry model. For Poiseuille flow in a plane channel, it is shown that a special choice of the collision operator is required to reproduce the correct flow profile. This systematic approach sheds light on the consequences of the reduced symmetry at the boundary and leads to a deeper understanding of boundary conditions in the lattice Boltzmann method, which can help to develop improved boundary conditions and more accurate simulation results.
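As a point of reference for the microdynamics described above, the following is a minimal sketch of a deterministic single-relaxation-time (BGK) D2Q9 lattice Boltzmann step in Python; it is illustrative only and does not include the thermal fluctuations or tunable-slip boundary conditions developed in the thesis. The relaxation time tau sets the kinematic viscosity via nu = (tau - 1/2)/3 in lattice units.

    import numpy as np

    # D2Q9 lattice: discrete velocities and weights
    c = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
                  (1, 1), (-1, 1), (-1, -1), (1, -1)])
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)

    def equilibrium(rho, ux, uy):
        """Second-order expansion of the Maxwell-Boltzmann equilibrium."""
        cu = 3.0 * (c[:, 0, None, None]*ux + c[:, 1, None, None]*uy)
        usq = 1.5 * (ux**2 + uy**2)
        return w[:, None, None] * rho * (1 + cu + 0.5*cu**2 - usq)

    def lbm_step(f, tau):
        """One BGK collision followed by streaming along the lattice links."""
        rho = f.sum(axis=0)
        ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
        uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
        f += (equilibrium(rho, ux, uy) - f) / tau      # local collision
        for i in range(9):                             # propagation
            f[i] = np.roll(f[i], shift=(c[i, 0], c[i, 1]), axis=(0, 1))
        return f

    # 64x64 periodic fluid initialized at rest
    f = equilibrium(np.ones((64, 64)), np.zeros((64, 64)), np.zeros((64, 64)))
    for _ in range(100):
        f = lbm_step(f, tau=0.8)

The fluctuating variant derived in the thesis would add a stochastic term to the collision step, constructed so that detailed balance thermalizes all modes, including the kinetic ones.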

Relevance:

10.00%

Publisher:

Abstract:

Modern software systems, in particular distributed ones, are everywhere around us and are at the basis of our everyday activities. Hence, guaranteeing their correctness, consistency and safety is of paramount importance. Their complexity makes the verification of such properties a very challenging task. It is natural to expect that these systems are reliable and, above all, usable. i) In order to be reliable, compositional models of software systems need to account for consistent dynamic reconfiguration, i.e., changing at runtime the communication patterns of a program. ii) In order to be usable, compositional models of software systems need to account for interaction, i.e., communication patterns among components that collaborate to achieve a common task. The aim of the Ph.D. was to develop powerful techniques based on formal methods for the verification of correctness, consistency and safety properties related to dynamic reconfiguration and communication in complex distributed systems. In particular, static analysis techniques based on types and type systems appeared to be an adequate methodology, considering their success in guaranteeing not only basic safety properties, but also more sophisticated ones, like deadlock or livelock freedom in a concurrent setting. The main contributions of this dissertation are twofold. i) On the components side: we design types and a type system for a concurrent object-oriented calculus to statically ensure consistency of dynamic reconfigurations related to modifications of communication patterns in a program during execution time. ii) On the communication side: we study advanced safety properties related to communication in complex distributed systems, like deadlock-freedom, livelock-freedom and progress. Most importantly, we exploit an encoding of the types and terms of a typical distributed language, the session π-calculus, into the standard typed π-calculus, in order to understand their expressive power.
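To give a concrete flavor of the session-type machinery (a minimal illustration of the standard duality condition, not the dissertation's actual type system), the sketch below encodes finite binary session types in Python and checks when two endpoints can safely interact:

    # A session type is ("send", payload, cont), ("recv", payload, cont), or "end".

    def dual(s):
        """Dual of a session type: each send becomes a receive and vice versa."""
        if s == "end":
            return "end"
        op, payload, cont = s
        return ("recv" if op == "send" else "send", payload, dual(cont))

    def compatible(s1, s2):
        """Two endpoints interact safely iff their session types are dual."""
        return dual(s1) == s2

    # A client that sends an int, then receives a bool, then stops,
    # matches a server that follows the dual protocol.
    client = ("send", "int", ("recv", "bool", "end"))
    server = ("recv", "int", ("send", "bool", "end"))
    assert compatible(client, server)

Duality rules out communication mismatches by construction; properties such as deadlock- and livelock-freedom require the finer-grained analyses studied in the dissertation.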

Relevance:

10.00%

Publisher:

Abstract:

During the last few decades, an unprecedented technological growth has been at the center of embedded systems design, with Moore's Law being the leading factor of this trend. Today, in fact, an ever-increasing number of cores can be integrated on the same die, marking the transition from state-of-the-art multi-core chips to the new many-core design paradigm. Despite the extraordinarily high computing power, the complexity of many-core chips opens the door to several challenges. As a result of the increased silicon density of modern Systems-on-a-Chip (SoC), the design space exploration needed to find the best design has exploded, and hardware designers are facing the problem of a huge design space. Virtual Platforms have always been used to enable hardware-software co-design, but today they must cope with the huge complexity of both hardware and software systems. In this thesis two different research works on Virtual Platforms are presented: the first is intended for the hardware developer, to easily allow complex cycle-accurate simulations of many-core SoCs; the second exploits the parallel computing power of off-the-shelf General Purpose Graphics Processing Units (GPGPUs), with the goal of increased simulation speed. The term virtualization can be used in the context of many-core systems not only to refer to the aforementioned hardware emulation tools (Virtual Platforms), but also for two other main purposes: 1) to help the programmer achieve the maximum possible performance of an application, by hiding the complexity of the underlying hardware; 2) to efficiently exploit the highly parallel hardware of many-core chips in environments with multiple active Virtual Machines. This thesis focuses on virtualization techniques with the goal of mitigating, and overcoming where possible, some of the challenges introduced by the many-core design paradigm.
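To illustrate why cycle-accurate simulation of many-core chips is so demanding (and why accelerating it, e.g., on GPGPUs, is attractive), the toy lock-step simulator below steps every component once per simulated cycle; this is a didactic sketch, not the Virtual Platform developed in the thesis:

    class Core:
        """Toy core model: retires one instruction per simulated cycle."""
        def __init__(self, name, n_instr):
            self.name, self.remaining = name, n_instr

        def tick(self):
            if self.remaining > 0:
                self.remaining -= 1

    def simulate(components, max_cycles):
        """Lock-step cycle-accurate loop: cost grows with cores x cycles."""
        for cycle in range(max_cycles):
            for comp in components:            # every component, every cycle
                comp.tick()
            if all(c.remaining == 0 for c in components):
                return cycle + 1
        return max_cycles

    cores = [Core(f"core{i}", 1000 + i) for i in range(16)]
    print("finished after", simulate(cores, 10**6), "cycles")

Because the inner loop touches every core on every cycle, simulation time scales with the product of core count and cycle count, which is exactly the cost that parallel GPGPU-based simulation tries to amortize.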

Relevance:

10.00%

Publisher:

Abstract:

With the aim of providing people with sustainable options, engineers are ethically required to hold the safety, health and welfare of the public paramount and to satisfy society's need for sustainable development. The global crisis and related sustainability challenges are calling for a fundamental change in culture, structures and practices. Sustainability Transitions (ST) have been recognized as a promising framework for radical system innovation towards sustainability. In order to enhance the effectiveness of transformative processes, both the adoption of a transdisciplinary approach and the experimentation of practices are crucial. The evolution of approaches towards ST provides a series of inspiring cases which make it possible to identify advances in making sustainability transitions happen. In this framework, the thesis has emphasized the role of Transition Engineering (TE). TE adopts a transdisciplinary approach for engineering to face the sustainability challenges and address the risks of unsustainability. To this end, a definition of Transition Technologies is provided, as valid instruments for contributing to ST. In the empirical section, several transition initiatives have been analysed, especially at the urban level. As a consequence, the model of the living-lab of sustainability has emerged as crucial. Living-labs are environments in which innovative technologies and services are co-created with the active participation of users. In this framework, the university can play a key role as a learning organization. The core of the thesis concerns the experimental application of the transition approach within the School of Engineering and Architecture of the University of Bologna at the Terracini Campus, with the final vision of realizing a living-lab of sustainability. In particular, a Transition Team has been established and several transition experiments have been conducted. The final result is not only the improvement of the sustainability and resilience of the Terracini Campus, but also the demonstration that the university can generate solutions and strategies that tackle the complex, dynamic factors fuelling the global crisis.

Relevance:

10.00%

Publisher:

Abstract:

Resource management is of paramount importance in network scenarios, and it is a long-standing and still open issue. Unfortunately, while technology and innovation continue to evolve, our network infrastructure has remained almost unchanged for decades; this phenomenon is known as "Internet ossification". Software-Defined Networking (SDN) is an emerging paradigm in computer networking that allows a logically centralized software program to control the behavior of an entire network. This is done by decoupling the network control logic from the underlying physical routers and switches that forward traffic to the selected destination. One mechanism that allows the control plane to communicate with the data plane is OpenFlow. Network operators can thus write high-level control programs that specify the behavior of an entire network. Moreover, centralized control makes it possible to combine more specific and complex tasks involving many network functionalities, e.g., security, resource management and control, into a single framework. Nowadays, the explosive growth of real-time applications requiring stringent Quality of Service (QoS) guarantees pushes network programmers to design network protocols that deliver specific performance guarantees. This thesis exploits the use of SDN in conjunction with OpenFlow to manage differentiated network services with high QoS. Initially, we define a QoS Management and Orchestration architecture that allows us to manage the network in a modular way. Then, we provide a seamless integration between the architecture and the standard SDN paradigm, following the separation between the control and data planes. This work is a first step towards the deployment of our proposal in the University of California, Los Angeles (UCLA) campus network with differentiated services and stringent QoS requirements. We also plan to exploit our solution to manage the handoff between different network technologies, e.g., Wi-Fi and WiMAX. Indeed, the model can be run with different parameters, depending on the communication protocol, and can provide optimal results to be implemented on the campus network.
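At its core, an OpenFlow-style data plane reduces to prioritized match-action rules; the Python sketch below (a toy model with invented field names, not the thesis architecture or any real controller API) shows how a QoS policy can steer matching traffic into a dedicated queue:

    from dataclasses import dataclass

    @dataclass
    class FlowRule:
        """Simplified OpenFlow-style rule: match fields, output queue, priority."""
        match: dict        # e.g. {"tcp_dst": 554} for a streaming service
        queue_id: int      # QoS queue on the output port
        priority: int

    class FlowTable:
        def __init__(self):
            self.rules = []

        def install(self, rule):
            """Insert a rule, keeping highest priority first (the controller's job)."""
            self.rules.append(rule)
            self.rules.sort(key=lambda r: -r.priority)

        def lookup(self, pkt):
            """Return the queue of the highest-priority matching rule."""
            for r in self.rules:
                if all(pkt.get(k) == v for k, v in r.match.items()):
                    return r.queue_id
            return 0  # default best-effort queue

    table = FlowTable()
    table.install(FlowRule({"tcp_dst": 554}, queue_id=1, priority=10))
    assert table.lookup({"ip_dst": "10.0.0.2", "tcp_dst": 554}) == 1  # real-time flow
    assert table.lookup({"ip_dst": "10.0.0.3", "tcp_dst": 80}) == 0   # best effort

In a real deployment the controller would push such rules to the switches as OpenFlow flow modifications, and the per-port queues would be provisioned with rate guarantees.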

Relevance:

10.00%

Publisher:

Abstract:

Understanding and controlling the mechanism of the diffusion of small molecules, macromolecules and nanoparticles in heterogeneous environments is of paramount fundamental and technological importance. The aim of the thesis is to show how, by studying tracer diffusion in complex systems, one can obtain information about both the tracer itself and the system in which the tracer is diffusing.

In the first part of my thesis I will introduce Fluorescence Correlation Spectroscopy (FCS), which is a powerful tool to investigate the diffusion of fluorescent species in various environments. By using the main advantage of FCS, namely the very small probing volume (<1 µm³), I was able to track the kinetics of phase separation in polymer blends at late stages by looking at the molecular tracer diffusion in individual domains of the heterogeneous structure of the blend. The phase separation process at intermediate stages was monitored with laser scanning confocal microscopy (LSCM) in real time, providing images of droplet coalescence and growth.

In a further project described in my thesis I will show that, even when the length scale of the heterogeneities becomes smaller than the FCS probing volume, one can still obtain important microscopic information by studying small-tracer diffusion. To do so, I will introduce a system of star-shaped polymer solutions and will demonstrate that the mobility of small molecular tracers on the microscopic level is hardly affected by the transition of the polymer system to a "glassy" macroscopic state.

In the last part of the thesis I will introduce and describe a new stimuli-responsive system which I have developed that combines two levels of nanoporosity. The system is based on poly(N-isopropylacrylamide) (PNIPAM) and silica inverse opals (iOpals), and allows the diffusion of tracer molecules to be controlled.
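For reference, FCS extracts mobilities by fitting the measured autocorrelation of the intensity fluctuations; for free 3D diffusion of a single species through a Gaussian observation volume, the textbook model is

    $G(\tau) = \frac{1}{\langle N \rangle}\left(1 + \frac{\tau}{\tau_D}\right)^{-1}\left(1 + \frac{\tau}{S^2 \tau_D}\right)^{-1/2}, \qquad D = \frac{w_{xy}^2}{4 \tau_D},$

where ⟨N⟩ is the mean number of fluorescent molecules in the probing volume, τ_D the diffusion time, w_xy the lateral radius of the volume, and S = w_z/w_xy its aspect ratio. Deviations from this simple form are precisely what signal heterogeneous or anomalous tracer dynamics.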

Relevance:

10.00%

Publisher:

Abstract:

Gels are elastic porous polymer networks that exhibit pronounced mechanical properties. Due to their biocompatibility, 'responsive hydrogels' (HG) have many biomedical applications, ranging from biosensors and drug delivery to tissue engineering. They respond to external stimuli, such as temperature and salt, by changing their dimensions. Of paramount importance is the ability to engineer the penetrability and diffusion of interacting molecules in the crowded HG environment, as this would enable one to optimize a specific functionality. Even though the conditions under which biomedical devices operate are rather complex, a bottom-up approach can reduce the complexity of the mutually coupled parameters influencing tracer mobility.

The present thesis focuses on interaction-induced tracer diffusion in polymer solutions and their homologous gels, probed by means of Fluorescence Correlation Spectroscopy (FCS). This is a single-molecule-sensitive technique with the advantage of optimal performance at the ultralow tracer concentrations typically employed in biosensors. Two different types of hydrogels have been investigated: a conventional one with broad polydispersity in the distance between crosslink points, and a so-called 'ideal' one with uniform mesh size distribution. The former is based on a thermoresponsive polymer exhibiting phase separation in water at temperatures close to human body temperature. The latter represents an optimal platform to study tracer diffusion. The mobilities of different tracers, varying in size, geometry and tracer-polymer attractive strength, have been investigated in each network as perturbed by different stimuli.

The thesis constitutes a systematic effort towards elucidating the role of the strength and nature of different tracer-polymer interactions in tracer mobility. It outlines that interactions can still be very important even in the simplified case of dilute polymer solutions; it also demonstrates that the presence of permanent crosslinks exerts a distinct tracer slowdown, depending on the tracer type and the nature of the tracer-polymer interactions, expressed differently by each tracer with regard to the selected stimulus. In aqueous polymer solutions, the tracer slowdown is found to be system-dependent, and no universal trend seems to hold, in contrast to predictions from scaling theory for non-interacting nanoparticle mobility and empirical relations concerning the mesh size in polymer solutions. Complex tracer dynamics in polymer networks may be distinctly expressed by FCS, depending on the specific synergy among at least some of the following parameters: nature of interactions, external stimuli employed, tracer size and type, crosslink density and swelling ratio.
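One such empirical relation, against which measured slowdowns can be compared, is the classical scaling estimate for the correlation (mesh) length of a semidilute solution in good solvent (standard polymer physics, quoted here as background rather than a result of the thesis):

    $\xi(c) \approx R_g \left( \frac{c}{c^*} \right)^{-3/4},$

where R_g is the radius of gyration of the chains and c* the overlap concentration. Scaling theories for non-interacting nanoparticles then predict that mobility is governed by the ratio of probe size to ξ, which is exactly the kind of universal behavior that the interacting tracers studied here are found to violate.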

Relevance:

10.00%

Publisher:

Abstract:

The European Community has stressed the importance of achieving a common understanding of how to deal with environmental noise through community actions of the Member States. This implies the use of harmonized indicators and specific information regarding the values of the indicators, the exceedance of limits, and the number of people and dwellings exposed to noise. The D.Lgs. 149/2005, in compliance with the European Directive 2002/49/EC, defines the methodologies, noise indicators and types of outputs required. In this dissertation the work done for the noise mapping of the highly trafficked roads of the Province of Bologna will be reported. The study accounts for the environmental noise generated by the road infrastructure outside the urban agglomeration of Bologna; roads carrying more than three million vehicles per year will be considered. The process of data collection and validation will be reported, as well as the implementation of the calculation method in the software and the procedure used to create and calibrate the calculation model. Results will be provided as required by the legislation, in the form of maps and tables. Moreover, the results for each road section will be combined to gain a general understanding of the situation in the overall studied area. Although understanding the noise levels and the number of people exposed is paramount, it is not sufficient for developing noise abatement strategies. Thus a further step will be addressed: the creation of priority maps as the basis of action plans for organizing and prioritizing noise reduction and abatement solutions. Noise reduction measures are reported qualitatively in the annex and constitute preliminary research.
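The central harmonized indicator required by Directive 2002/49/EC is the day-evening-night level L_den, defined in Annex I of the Directive as

    $L_{den} = 10 \log_{10} \left[ \frac{1}{24} \left( 12 \cdot 10^{L_{day}/10} + 4 \cdot 10^{(L_{evening}+5)/10} + 8 \cdot 10^{(L_{night}+10)/10} \right) \right],$

where the 5 dB and 10 dB terms penalize evening and night exposure, and the 12/4/8 hour split of the day is the default that Member States may adapt. A separate night-time indicator, L_night, is used for assessing sleep disturbance.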

Relevance:

10.00%

Publisher:

Abstract:

Nowadays, data handling and data analysis in High Energy Physics require a vast amount of computational power and storage. In particular, the Worldwide LHC Computing Grid (WLCG), an infrastructure and pool of services developed and deployed by a large community of physicists and computer scientists, has proven to be a game changer in the efficiency of data analysis during Run-I at the LHC, playing a crucial role in the Higgs boson discovery. Recently, the Cloud computing paradigm has been emerging and reaching a considerable level of adoption by many different organizations, scientific and otherwise. Clouds give access to large computing resources, not owned by the user, that are shared among many communities. Considering the challenging requirements of LHC physics in Run-II and beyond, the LHC computing community is interested in exploring Clouds to see whether they can provide a complementary approach - or even a valid alternative - to the existing Grid-based technological solutions. Within the LHC community, several experiments have been adopting Cloud approaches, and the experience of the CMS experiment in particular is relevant to this thesis. The LHC Run-II has just started, and Cloud-based solutions are already in production for CMS; however, other approaches to Cloud usage are being explored and are at the prototype level, such as the work done in this thesis. This effort is of paramount importance for equipping CMS with the capability to elastically and flexibly access and utilize the computing resources needed to face the challenges of Run-III and Run-IV. The main purpose of this thesis is to present forefront Cloud approaches that allow the CMS experiment to extend onto on-demand resources, dynamically allocated as needed. Moreover, direct access to Cloud resources is presented as a use case well suited to the needs of the CMS experiment.

Chapter 1 presents an overview of High Energy Physics at the LHC and of the CMS experience in Run-I, as well as the preparation for Run-II. Chapter 2 describes the current CMS Computing Model, and Chapter 3 presents the Cloud approaches pursued and used within the CMS Collaboration. Chapters 4 and 5 discuss the original and forefront work done in this thesis to develop and test working prototypes of elastic extensions of CMS computing resources on Clouds, and of HEP Computing "as a Service". The impact of this work on benchmark CMS physics use cases is also demonstrated.
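The essence of such an elastic extension is a control loop that matches cloud capacity to batch demand. The Python sketch below is purely illustrative: the helper functions are hypothetical stand-ins for a batch-system query and a cloud provider API, not actual CMS, WLCG or provider interfaces.

    import random
    import time

    elastic_pool = {"vms": 0}
    JOBS_PER_VM, MAX_VMS = 8, 100   # assumed capacity per VM and budget cap

    def pending_jobs():
        """Hypothetical stand-in: number of jobs waiting in the batch queue."""
        return random.randint(0, 500)

    def provision_vm():
        elastic_pool["vms"] += 1     # a real system would call the cloud API

    def terminate_vm():
        elastic_pool["vms"] -= 1     # ...and drain the node before killing it

    def elastic_loop(cycles=10):
        """Scale the VM pool so capacity tracks demand, within the budget cap."""
        for _ in range(cycles):
            demand = pending_jobs()
            target = min(MAX_VMS, -(-demand // JOBS_PER_VM))  # ceiling division
            while elastic_pool["vms"] < target:
                provision_vm()
            while elastic_pool["vms"] > target:
                terminate_vm()
            time.sleep(0.01)          # polling interval, shortened for the sketch

    elastic_loop()
    print("final pool size:", elastic_pool["vms"])

Real deployments must additionally handle provisioning latency, job scheduling, and graceful draining, which is where much of the engineering effort described in the thesis lies.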

Relevance:

10.00%

Publisher:

Abstract:

Cell therapies have gained increasing interest and have been developed in several approaches for the treatment of damaged myocardium. The results of multiple clinical trials have already been reported, almost exclusively involving the direct injection of stem cells. It has, however, been postulated that the efficiency of injected cells may be hindered by the mechanical trauma of the injection and by their low survival in the hostile environment; indeed, it has been demonstrated that cell mortality due to the injection approaches 90%. Major issues still need to be resolved, and bed-to-bench follow-up is paramount to foster clinical implementations. The tissue engineering approach thus constitutes an attractive alternative, since it provides the opportunity to deliver a large number of cells that are already organized in an extracellular matrix. Recent laboratory reports have confirmed the interest of this approach and have already encouraged a few groups to investigate it in clinical studies. We discuss current knowledge regarding engineered tissue for myocardial repair or replacement, and in particular the recent implementation of nanotechnological approaches.

Relevance:

10.00%

Publisher:

Abstract:

Ethyl glucuronide (EtG) and ethyl sulfate (EtS) are direct alcohol consumption markers widely used nowadays for clinical and forensic applications. They are detectable in blood and urine even after consumption of trace amounts of ethanol and over a longer time frame, remaining detectable even when ethanol itself is no longer present. The instability of EtG against bacterial degradation in contaminated urine samples, and/or the possible post-collection synthesis of this metabolite in samples containing, e.g., Escherichia coli and ethanol, may cause false identification of alcohol uptake. Therefore, it is of paramount importance to limit these error sources by inhibiting any bacterial growth that causes hydrolysis or synthesis of EtG. This study evaluates a new method of collecting urine samples on filter paper, dried urine spots (DUS), for the simultaneous detection of EtG, EtS and creatinine, which has the great advantage of inhibiting bacterial activity. In addition, a method validation for the determination of EtG and EtS in DUS was performed according to the FDA guidelines. Sterile-filtered urine was spiked with EtG and EtS, inoculated with E. coli and incubated. Liquid and dried urine samples were collected after various time intervals up to 96 h. Liquid samples were frozen immediately after collection, whereas aliquots for DUS were pipetted onto filter paper, allowed to dry, and stored at room temperature (RT) until analysis one week later. The specimens were analyzed by LC-ESI-MS/MS. As expected, degradation of EtG, but not of EtS, was observed in the contaminated liquid urine samples. However, the specimens collected on filter paper and stored at RT showed no degradation during storage. Collecting urine samples on filter paper for EtG and EtS analysis therefore turns out to be a reliable method to avoid bacterial degradation of EtG and EtS; consequently, stabilization of these ethanol metabolites is achieved. In addition, the simultaneous measurement of the creatinine content as an indicator of urine dilution helps to interpret the results. Method validation for EtG and EtS in DUS was satisfactory, showing linearity of the calibration curves in the studied concentration range, and good precision, accuracy and selectivity.
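As an illustration of the linearity check in such a validation (with invented numbers, not data from this study), a calibration fit and back-calculated accuracies can be computed as follows; the FDA bioanalytical guidelines require back-calculated standards within ±15% of nominal (±20% at the lower limit of quantification):

    import numpy as np

    # Hypothetical calibration levels and analyte/internal-standard area ratios
    conc  = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 20.0])   # concentration units
    ratio = np.array([0.021, 0.098, 0.205, 1.01, 2.03, 4.10])

    slope, intercept = np.polyfit(conc, ratio, 1)    # linear calibration curve
    back_calc = (ratio - intercept) / slope          # back-calculated concentrations
    accuracy = 100 * back_calc / conc                # percent of nominal

    r = np.corrcoef(conc, ratio)[0, 1]
    print(f"r^2 = {r**2:.4f}")                       # linearity of the fit
    print("accuracy (%):", np.round(accuracy, 1))    # should fall within 85-115%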

Relevance:

10.00%

Publisher:

Abstract:

REASONS FOR PERFORMING STUDY: The horse-owner-assessed respiratory signs index (HOARSI-1-4: healthy, mildly, moderately and severely affected, respectively) is based on owner-reported clinical history and has been used for investigating the genetics of recurrent airway obstruction (RAO) with large sample sizes. Reliable phenotype identification is of paramount importance in genetic studies. Owner reports of respiratory signs have shown good repeatability, but the agreement of HOARSI with an in-depth examination of the lower respiratory tract has not been investigated. OBJECTIVES: To determine the correlation of HOARSI grades 3/4 with the characteristics of RAO, and of HOARSI-2 with the characteristics of inflammatory airway disease; further, to test whether there are phenotypic differences in the manifestation of lung disease between families. METHODS: Seventy-one direct offspring of 2 RAO-affected Warmblood stallions (33 from the first family, 38 from the second) were graded HOARSI-1-4 and underwent a clinical examination of the respiratory system, arterial blood gas analysis, endoscopic mucus scoring, cytology of tracheobronchial secretion (TBS) and bronchoalveolar lavage fluid (BALF), and clinical assessment of airway reactivity to methacholine chloride. RESULTS: HOARSI-3/4 animals in clinical exacerbation showed signs consistent with RAO: coughing, nasal discharge, abnormal lung sounds and breathing pattern, as well as increased numbers of neutrophils in TBS and BALF, excessive mucus accumulation and airway hyperresponsiveness to methacholine. HOARSI-3/4 horses in remission only had increased amounts of tracheal mucus and TBS neutrophil percentages. Clinical phenotypes were not significantly different between the 2 families. CONCLUSIONS AND CLINICAL RELEVANCE: HOARSI reliably identifies RAO-affected horses in our population.

Relevance:

10.00%

Publisher:

Abstract:

During and after an erosive challenge, behavioral factors play a role in modifying the extent of erosive tooth wear. The manner in which dietary acids are introduced into the mouth (gulping, sipping, use of a straw) affects how long the teeth are in contact with the erosive challenge. The frequency and duration of exposure to an erosive agent are of paramount importance. Night-time exposure to erosive agents (e.g. baby bottle-feeding) may be particularly destructive because of the absence of salivary flow. Health-conscious individuals tend to ingest acidic drinks and juices more frequently and tend to have higher than average oral hygiene. While good oral hygiene is of proven value in the prevention of periodontal disease and dental caries, frequent toothbrushing with abrasive oral hygiene products may enhance erosive tooth wear. Unhealthy lifestyles, such as the consumption of designer drugs and alcopops or alcohol abuse, are other important behavioral factors.

Relevance:

10.00%

Publisher:

Abstract:

Portfolio use in writing studies contexts is becoming ubiquitous; as such, portfolios are in danger of being rendered meaningless, and we therefore need to more fully theorize and historicize them. To this end, I examine two kinds of portfolio: the standardized portfolio used for assessment purposes and the personalized portfolio used for entering the job market. I take a critical look at portfolios as a form of technology and acknowledge some of the dangers of blindly using portfolios to gain employment in the current economic structure of fast capitalism. For educators in the writing studies fields, it is paramount that instructors have a critical awareness of the consequences of portfolio creation for students as designers, lifelong learners, and citizens of a larger society. I argue that a better understanding of the pedagogical implications of portfolio use is imperative before implementing portfolios in the classroom, and that a social-epistemic approach provides a valuable rethinking of portfolio use for assessment purposes. Further, I argue for the notions of meditation and transformation to be added alongside collection, selection, and reflection, because they enable portfolio designers and evaluators alike to thoughtfully consider new ways of meaning-making and innovation. Also important, and included with meditation and transformation, is the understanding that students are ideologically positioned in the educational system; for them, beginning to recognize their situatedness is a step toward becoming designers of change. The portfolio can be a site for that change, and a way for them to document their own learning and ways of making meaning over a lifetime.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Transient neurological dysfunction (TND) consists of postoperative confusion, delirium and agitation. It is underestimated after surgery on the thoracic aorta, and its influence on long-term quality of life (QoL) has not yet been studied. This study aimed to assess the influence of TND on short- and long-term outcome following surgery of the ascending aorta and proximal arch. METHODS: Nine hundred and seven patients undergoing surgery of the ascending aorta and the proximal aortic arch at our institution were included. Two hundred and ninety patients (31.9%) underwent surgery because of acute aortic dissection type A (AADA) and 617 patients because of aortic aneurysm. In 547 patients (60.3%) the distal anastomosis was performed using deep hypothermic circulatory arrest (DHCA). TND was defined as a Glasgow coma scale (GCS) value <13. All surviving patients had a clinical follow-up, and QoL was assessed with an SF-36 questionnaire. RESULTS: Overall in-hospital mortality was 8.3%. TND occurred in 89 patients (9.8%). Compared to patients without TND, those who suffered from TND were older (66.4 vs 59.9 years, p<0.01), more frequently underwent emergent procedures (53% vs 32%, p<0.05) and more frequently had surgery under DHCA (84.3% vs 57.7%, p<0.05). However, the duration of DHCA and the extent of surgery did not influence the incidence of TND. In-hospital mortality was similar in patients with and without TND (12.0% vs 11.4%; p=ns). Patients with TND suffered more frequently from coronary artery disease (28% vs 20.8%, p=ns) and were more frequently admitted in a compromised haemodynamic condition (23.6% vs 9.9%, p<0.05). The postoperative course revealed more pulmonary complications, such as prolonged mechanical ventilation. In addition to their transient neurological dysfunction, significantly more of these patients had strokes with permanent neurological loss of function (14.6% vs 4.8%, p<0.05) compared to patients without TND. ICU and hospital stays were significantly prolonged in TND patients (18±13 days vs 12±7 days, p<0.05). Over a mean follow-up interval of 27±14 months, patients with TND showed a significantly impaired QoL. CONCLUSION: The neurological outcome following surgery of the ascending aorta and proximal aortic arch is of paramount importance. The impact of TND is underestimated: it negatively affects both short- and long-term outcome.