967 results for "Alternative system"
Abstract:
Objectives. The purpose of this study was to evaluate the reactivity and polymerization kinetics of a model dental adhesive resin with water-soluble initiator systems. Methods. A monomer blend based on Bis-GMA, TEGDMA and HEMA was used as a model dental adhesive resin, which was polymerized using a thioxanthone-type photoinitiator (QTX). Binary and ternary photoinitiator systems were formulated using 1 mol% of each initiator. The co-initiators used in this study were ethyl 4-dimethylaminobenzoate (EDAB), diphenyliodonium hexafluorophosphate (DPIHFP), 1,3-diethyl-2-thiobarbituric acid (BARB), and p-toluenesulfinic acid sodium salt hydrate (SULF). Absorption spectra of the initiators were measured using a UV-Vis spectrophotometer, and the photon absorption energy (PAE) was calculated. The binary system camphorquinone (CQ)/amine was used as a reference group (control). Twelve groups were tested in triplicate. Fourier-transform infrared spectroscopy (FTIR) was used to follow the polymerization reaction during the photoactivation period and to obtain the degree of conversion (DC) and maximum polymerization rate (Rp(max)) profile of the model resin. Results. In the analyzed absorption profiles, the absorption spectrum of QTX is almost entirely localized in the UV region, whereas that of CQ lies in the visible range. With respect to the binary systems, CQ + EDAB exhibited higher DC and Rp(max) values. Among the formulations containing ternary initiator systems, CQ + QTX + EDAB was the only experimental group that exhibited an Rp(max) value greater than that of CQ + EDAB. The groups QTX + EDAB + DPIHFP and QTX + DPIHFP + SULF reached final DC values similar to those of CQ + EDAB, but exhibited lower reactivity. Significance. Water-soluble initiator systems should be considered as alternatives to the widely used CQ/amine system in dentin adhesive formulations. (C) 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
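For reference, the degree of conversion reported in FTIR studies of this kind is typically computed from the decrease of the aliphatic C=C absorption band relative to an internal reference band (commonly the aromatic C=C of Bis-GMA near 1608 cm-1, against the aliphatic band near 1638 cm-1). The abstract does not state which bands were used, so the expression below is only the standard two-band formulation, not necessarily the authors' exact procedure:

```latex
\mathrm{DC}(\%) \;=\; \left( 1 - \frac{\left( A_{\mathrm{aliphatic}} / A_{\mathrm{aromatic}} \right)_{\mathrm{cured}}}{\left( A_{\mathrm{aliphatic}} / A_{\mathrm{aromatic}} \right)_{\mathrm{uncured}}} \right) \times 100
```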
Abstract:
Purpose: The duplex system is one of the most common anomalies of the upper urinary tract. Its anatomical and clinical presentation determines the treatment. Usually the upper moiety has poor function and requires resection, but when it is not significantly impaired, preservation is recommended. Laparoscopic reconstruction with upper pole preservation is presented as an alternative treatment. Materials and Methods: Four female patients with duplex system were treated, one presenting with recurrent urinary tract infection and the others with urinary incontinence associated with an infrasphincteric ectopic ureter. The surgical procedure involved a laparoscopic ureteropyeloanastomosis of the upper pole ureter to the pelvis of the lower moiety, with prior insertion of a double-J stent. Results: Surgical time varied from 120 to 150 minutes, with minimal blood loss in all cases. Follow-up varied from 15 to 30 months, with resolution of the clinical symptoms and preservation of upper moiety function. Conclusion: Laparoscopic ureteropyeloanastomosis is a feasible and safe minimally invasive option for the treatment of duplex system.
Abstract:
The objective of this study was to evaluate a methodology for determining the hemolytic activity of the alternative complement pathway as an indicator of innate immunity in the Brazilian fish pacu (Piaractus mesopotamicus), and to verify the influence of beta-glucan as an immunostimulant. Fish were fed diets containing 0, 0.1 and 1% beta-glucan for seven days and then inoculated with Aeromonas hydrophila. Seven days after the challenge, they were bled for serum extraction. The methodology consisted of a kinetic assay that allows calculation of the time required for the serum complement proteins to promote 50% lysis of a rabbit red blood cell suspension. The method developed in mammals was successfully applied to pacu and showed that the hemolytic activity of the complement system proteins (alternative pathway) increased after the pathogen challenge, but was not influenced by the beta-glucan treatment.
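A minimal sketch of how the time to 50% lysis could be read off such a kinetic assay, assuming lysis-fraction readings taken at fixed intervals; the readings, threshold handling and linear interpolation below are illustrative, not the authors' protocol:

```python
import numpy as np

# Hypothetical kinetic readings: fraction of rabbit RBCs lysed at each time point (min).
times = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)        # minutes
lysis = np.array([0.0, 0.08, 0.21, 0.39, 0.58, 0.74, 0.85])   # fraction lysed

def t50(times, lysis, threshold=0.5):
    """Linearly interpolate the time at which lysis first crosses the threshold."""
    above = np.where(lysis >= threshold)[0]
    if len(above) == 0:
        raise ValueError("Lysis never reached the threshold.")
    i = above[0]
    if i == 0:
        return times[0]
    # Linear interpolation between the two bracketing readings.
    frac = (threshold - lysis[i - 1]) / (lysis[i] - lysis[i - 1])
    return times[i - 1] + frac * (times[i] - times[i - 1])

print(f"T50 = {t50(times, lysis):.2f} min")  # a shorter T50 indicates higher alternative-pathway activity
```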
Abstract:
Stochastic methods based on time-series modeling combined with geostatistics can be useful tools to describe the variability of water-table levels in time and space and to account for uncertainty. Monitoring water-level networks can give information about the dynamics of the aquifer domain in both dimensions. Time-series modeling is an elegant way to treat monitoring data without the complexity of physical mechanistic models. Time-series model predictions can be interpolated spatially, with the spatial differences in water-table dynamics determined by the spatial variation in the system properties and the temporal variation driven by the dynamics of the inputs into the system. An integration of stochastic methods is presented, based on time-series modeling and geostatistics, as a framework to predict water levels for decision making in groundwater management and land-use planning. The methodology is applied to a case study in an outcrop area of the Guarani Aquifer System (GAS) located in the southeastern part of Brazil. Communication of results in a clear and understandable form, via simulated scenarios, is discussed as an alternative when translating scientific knowledge into applications of stochastic hydrogeology in large aquifers with limited monitoring network coverage, like the GAS.
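A heavily simplified sketch of the idea of coupling per-well time-series models with spatial interpolation. The synthetic rainfall and head data, the AR-plus-rainfall transfer model and the inverse-distance weighting (used here as a crude stand-in for the geostatistical step) are all assumptions made for illustration, not the study's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monitoring wells: (x, y) coordinates and synthetic monthly head series (m).
wells = {"W1": (2.0, 3.0), "W2": (8.0, 1.0), "W3": (5.0, 7.0)}
months = 60
rain = rng.gamma(2.0, 40.0, months)                                  # synthetic monthly rainfall (mm)
levels = {w: 600 + np.cumsum(0.002 * rain - 0.1) + rng.normal(0, 0.05, months)
          for w in wells}                                            # synthetic head series per well

def fit_transfer_model(h, p):
    """Fit h_t = a*h_{t-1} + b*p_t + c by ordinary least squares (simple AR-plus-rainfall model)."""
    X = np.column_stack([h[:-1], p[1:], np.ones(len(h) - 1)])
    coef, *_ = np.linalg.lstsq(X, h[1:], rcond=None)
    return coef  # a, b, c

def forecast(h_last, p_future, coef):
    a, b, c = coef
    out, h = [], h_last
    for p in p_future:
        h = a * h + b * p + c
        out.append(h)
    return np.array(out)

# One-year-ahead forecast at each well under an assumed average-rainfall scenario.
rain_future = np.full(12, rain.mean())
forecasts = {w: forecast(levels[w][-1], rain_future, fit_transfer_model(levels[w], rain))
             for w in wells}

# Inverse-distance weighting as a simple stand-in for the geostatistical (kriging) interpolation.
def idw(target, wells, values, power=2.0):
    d = np.array([np.hypot(target[0] - x, target[1] - y) for x, y in wells.values()])
    w = 1.0 / d**power
    return (w[:, None] * np.array(list(values.values()))).sum(axis=0) / w.sum()

print(idw((4.0, 4.0), wells, forecasts)[:3])  # predicted heads at an unmonitored location
```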
Abstract:
We performed an initial assessment of an alternative pressurized intraventilated (PIV) caging system for laboratory mice that uses direct-current microfans to achieve cage pressurization and ventilation. Twenty-nine pairs of female SPF BALB/c mice were used, with 19 experimental pairs kept in PIV cages and 10 control pairs kept in regular filter-top (FT) cages. Both groups were housed in a standard housing room with a conventional atmospheric control system. For both systems, intracage temperatures were in equilibrium with ambient room temperature. PIV cages showed a significant difference in pressure between days 1 and 8. Air speed (and consequently airflow rate) and the number of air changes hourly in the PIV cages showed decreasing trends. In both systems, ammonia concentrations increased with time, with significant differences between groups starting on day 1. Overall, the data revealed that intracage pressurization and ventilation using microfans is a simple, reliable system, with low cost, low maintenance requirements, and a low incidence of failures. Further experiments are needed to determine the potential influence of this system on reproductive performance and pulmonary integrity in mice.
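For context, the hourly air-change figure mentioned above follows directly from the measured air speed at the microfan outlet, the outlet cross-section and the cage volume; the numbers below are purely illustrative, not the study's measurements:

```python
# Hypothetical values for illustration only.
air_speed = 0.3          # m/s, measured at the microfan outlet
outlet_area = 2.0e-4     # m^2, cross-sectional area of the fan outlet
cage_volume = 7.5e-3     # m^3, internal volume of the cage

airflow = air_speed * outlet_area * 3600   # m^3/h delivered into the cage
ach = airflow / cage_volume                # air changes per hour
print(f"airflow = {airflow:.3f} m^3/h, ACH = {ach:.1f}")
```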
Abstract:
The Primary Care Information System (SIAB) concentrates basic healthcare information from all regions of Brazil. The information is collected by primary care teams through a paper-based procedure that degrades the quality of the information provided to the healthcare authorities and slows down the decision-making process. To overcome these problems, we propose a new data-gathering application, to be used by primary care teams for collecting family data, that runs on a mobile device connected to a 3G network and equipped with GPS. A prototype was developed in which a digital version of one SIAB form is made available on the mobile device. The prototype was tested in a basic healthcare unit located in a suburb of Sao Paulo. The results obtained so far show that the proposed process is a better alternative for data collection in primary care, both in terms of data quality and of the time needed to deliver the information to the healthcare authorities.
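A minimal sketch of what a single collected record might look like in such an application; the field names, identifiers and coordinates are hypothetical, since the abstract does not list the actual SIAB form fields:

```python
from dataclasses import dataclass, asdict
from datetime import datetime
import json

@dataclass
class FamilyVisitRecord:
    """One SIAB-like family registration captured on the mobile device (illustrative fields only)."""
    family_id: str
    household_members: int
    water_supply: str          # e.g., "public network", "well"
    visit_time: str            # ISO 8601 timestamp
    latitude: float            # from the device GPS
    longitude: float

record = FamilyVisitRecord(
    family_id="SP-0001",
    household_members=4,
    water_supply="public network",
    visit_time=datetime(2011, 5, 10, 9, 30).isoformat(),
    latitude=-23.55,
    longitude=-46.63,
)

payload = json.dumps(asdict(record))  # what would be sent over the 3G link to the central server
print(payload)
```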
Abstract:
Background: Translational errors can result in bypassing of the main viral protein reading frames and the production of alternate reading frame (ARF) or cryptic peptides. Within HIV, there are many such ARFs in both the sense and the antisense directions of transcription. These ARFs have the potential to generate immunogenic peptides called cryptic epitopes (CE). Both antiretroviral drug therapy and the immune system exert mutational pressure on HIV-1. Immune pressure exerted by ARF-specific CD8(+) T cells on the virus has already been observed in vitro. HAART has also been shown to select HIV-1 variants carrying drug-escape mutations. Since the mutational pressure exerted on one location of the HIV-1 genome can potentially affect all three reading frames, we hypothesized that ARF responses would be affected by this drug pressure in vivo. Methodology/Principal findings: In this study we identified new ARFs derived from sense and antisense transcription of HIV-1. Many of these ARFs are detectable in circulating viral proteins. They are predominantly found in the HIV-1 env nucleotide region. We measured T cell responses to 199 HIV-1 CE encoded within 13 sense and 34 antisense HIV-1 ARFs. We observed that these ARF responses are more frequent and of greater magnitude in chronically infected individuals than in acutely infected patients, and that in patients on HAART the breadth of ARF responses increased. Conclusions/Significance: These results have implications for vaccine design and unveil the existence of potential new epitopes that could be included as vaccine targets.
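To illustrate where such cryptic peptides can arise, the sketch below enumerates the three sense and three antisense reading frames of a short nucleotide fragment; it relies on Biopython's Seq utilities, and the fragment (the familiar start of HIV-1 gag) is used only as a convenient example, not as one of the ARFs identified in the study:

```python
from Bio.Seq import Seq

def six_frame_translations(nt: str) -> dict:
    """Translate a nucleotide fragment in all 3 sense and 3 antisense reading frames."""
    seq = Seq(nt)
    frames = {}
    for strand, s in (("sense", seq), ("antisense", seq.reverse_complement())):
        for offset in range(3):
            sub = s[offset:]
            sub = sub[: len(sub) - len(sub) % 3]        # trim to whole codons
            frames[(strand, offset)] = str(sub.translate())
    return frames

# Short example fragment (start of HIV-1 gag), used purely for illustration.
fragment = "ATGGGTGCGAGAGCGTCAGTATTAAGCGGGGGAGAATTAGAT"
for (strand, offset), peptide in six_frame_translations(fragment).items():
    print(f"{strand} frame +{offset}: {peptide}")
```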
Abstract:
Evaluating the activity of the complement system under conditions of altered thyroid hormone levels might help elucidate the role of complement in triggering autoimmune processes. Here, we investigated alternative pathway (AP) activity in male Wistar rats (180 ± 10 g) after altering their thyroid hormone levels by treatment with triiodothyronine (T3) or propylthiouracil (PTU), or by thyroidectomy. T3 and thyroxine (T4) levels were determined by chemiluminescence assays. Hemolytic assays were performed to evaluate the lytic activity of the AP. Factor B activity was evaluated using factor B-deficient serum. An anti-human factor B antibody was used to measure factor B levels in serum by radial immunodiffusion. T3 measurements in thyroidectomized animals and in animals treated with PTU demonstrated a significant reduction in hormone levels compared to control. The results showed a reduction in AP lytic activity in rats treated with increasing amounts of T3 (1, 10, or 50 µg). Factor B activity was also decreased in the sera of hyperthyroid rats treated with 1 to 50 µg T3. Additionally, treating rats with 25 µg T3 significantly increased factor B levels in their sera (P < 0.01). In contrast, increased factor B concentration and activity (32%) were observed in hypothyroid rats. We conclude that alterations in thyroid hormone levels affect the activity of the AP and of factor B, which may in turn affect their roles in antibody production.
Abstract:
Wood is a material of great applicability in construction, with properties advantageous for forming various structural systems, such as walls and roofs. Most roof structural systems follow models that have remained unchanged for a long time. A modular roof system combining different materials is proposed for rural construction: reforested wood (pine), oriented strand board (OSB) and roof tiles made of recycled long-life packaging material. Besides giving a destination to long-life packaging waste and providing thermal comfort, this alternative also promotes the use of reforested wood, whose cultivation provides an incentive for agribusiness. The structural performance of this alternative was evaluated through computer modeling and tests of two modular panels. The analysis is based on the results for vertical displacements, deformations and stresses. A positive correlation between theoretical and experimental values was observed, indicating the model's feasibility for use in roof structures. Therefore, the modular system represents a solution for new architectural conceptions in rural construction, for example storage buildings and cattle- and poultry-handling facilities, with the benefits provided by prefabricated building systems.
Abstract:
Background: This study aims to design an empirical test of the sensitivity of prescribing doctors to the price afforded by the patient, and to apply it to population data on primary care dispensations for cardiovascular disease and mental illness in the Spanish National Health System (NHS). Implications for drug policies are discussed. Methods: We used population data for 17 therapeutic groups of cardiovascular and mental illness drugs aggregated by health area to obtain 1424 observations ((8 cardiovascular groups * 70 areas) + (9 psychotropic groups * 96 areas)). All drugs are free for pensioners. For non-pensioner patients, 10 of the 17 therapeutic groups have a reduced copayment (RC) status of only 10% of the price with a ceiling of €2.64 per pack, while the remaining 7 groups have a full copayment (FC) rate of 40%. Differences in the average price of dispensations to pensioners and non-pensioners were modelled with multilevel regression models to test the following hypotheses: 1) for FC drugs there is a significant positive difference between the average prices of drugs prescribed to pensioners and non-pensioners; 2) for RC drugs there is no significant price differential between pensioner and non-pensioner patients; 3) the price differential of FC drugs prescribed to pensioners and non-pensioners is greater the higher the price of the drugs. Results: The average monthly price of dispensations to pensioners and non-pensioners does not differ for RC drugs, but for FC drugs pensioners receive more expensive dispensations than non-pensioners (estimated difference of €9.74 per DDD and month). There is a positive and significant effect of the drug price on the price differential between pensioners and non-pensioners. For FC drugs, each additional euro of the drug price increases the differential by nearly half a euro (0.492). We did not find significant differences in the intensity of the price effect among FC therapeutic groups. Conclusions: Doctors working in the Spanish NHS seem to be sensitive to the price that can be afforded by patients when they write prescriptions, although alternative hypotheses could also explain these findings.
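A hedged sketch of the kind of multilevel (random-intercept) specification described above, written with statsmodels; the synthetic data, variable names and exact model form are illustrative stand-ins for the authors' actual specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_areas, n_groups = 70, 17
rows = []
for area in range(n_areas):
    area_effect = rng.normal(0, 1)                     # random intercept per health area
    for group in range(n_groups):
        full_copay = 1 if group < 7 else 0             # 7 FC groups, 10 RC groups, as in the study design
        drug_price = rng.uniform(5, 60)                # synthetic average price of the group (EUR)
        # Synthetic outcome: the differential grows with price only for FC drugs (the hypothesis tested).
        price_diff = area_effect + full_copay * 0.5 * drug_price + rng.normal(0, 2)
        rows.append((price_diff, drug_price, full_copay, area))
df = pd.DataFrame(rows, columns=["price_diff", "drug_price", "full_copay", "health_area"])

# Multilevel (random-intercept) model echoing the kind of specification described in the abstract.
model = smf.mixedlm("price_diff ~ drug_price * full_copay", df, groups=df["health_area"])
print(model.fit().summary())
```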
Abstract:
Master's degree in Oceanography
Abstract:
Two major types of B cells, the antibody-producing cells of the immune system, are classically distinguished in the spleen: marginal zone (MZ) and follicular (FO). In addition, FO B cells are subdivided into FO I and FO II cells, based on the amount of surface IgM. MZ B cells, which surround the splenic follicles, rapidly produce IgM in response to blood-borne pathogens without T cell help, while T cell-dependent production of high affinity, isotype-switched antibodies is ascribed to FO I cells. The significance of FO II cells and the mechanism underlying B cell fate choices are unclear. We showed that FO II cells express more Sca1 than FO I cells and originate from a distinct B cell development program, marked by high expression of Sca1. MZ B cells can derive from the “canonical” Sca1lo pathways, as well as from the Sca1hi program, although the Sca1hi program shows a stronger MZ bias than the Sca1lo program, and extensive phenotypic plasticity exists between MZ and FO II, but not between MZ and FO I cells. The Sca1hi program is induced by hematopoietic stress and generates B cells with an Igλ-enriched repertoire. In aged mice, the canonical B cell development pathway is impaired, while the Sca1hi program is increased. Furthermore, we showed that a population of unknown function, defined as Lin-c-kit+Sca1+ (LSK-), contains early lymphoid precursors, with primarily B cell potential in vivo. Our data suggest that LSK- cells may represent a distinct precursor for the Sca1hi program in the bone marrow.
Abstract:
The research activity carried out during the PhD course in Electrical Engineering belongs to the field of electric and electronic measurements. The main subject of this thesis is a distributed measurement system to be installed in Medium Voltage power networks, together with the method developed to analyze the data acquired by the measurement system itself and to monitor power quality. Chapter 2 illustrates the increasing interest in power quality in electrical systems, reviewing the international research activity on the problem and the relevant standards and guidelines that have been issued. The quality of the voltage provided by utilities, and influenced by customers at the various points of a network, has emerged as an issue only in recent years, in particular as a consequence of energy market liberalization. Traditionally, the concept of quality of the delivered energy has been associated mostly with its continuity, so reliability was the main characteristic to be ensured for power systems. Nowadays, the number and duration of interruptions are the “quality indicators” most commonly perceived by customers; for this reason, a short section is also dedicated to network reliability and its regulation. In this context it should be noted that, although the measurement system developed during the research activity belongs to the field of power quality evaluation systems, the information registered in real time by its remote stations can be used to improve system reliability too. Given the vast range of power-quality-degrading phenomena that can occur in distribution networks, the study focused on electromagnetic transients affecting line voltages. The outcome of this study was the design and realization of a distributed measurement system which continuously monitors the phase signals at different points of a network, detects the occurrence of transients superposed on the fundamental steady-state component and registers the time of occurrence of such events. The data set is finally used to locate the source of the transient disturbance propagating along the network lines. Most of the oscillatory transients affecting line voltages are due to faults occurring at any point of the distribution system and must be detected before the protection equipment intervenes. An important conclusion is that the method can improve the reliability of the monitored network, since knowing the location of a fault allows the energy manager to reduce as much as possible both the area of the network to be disconnected for protection purposes and the time spent by technical staff to recover from the abnormal condition and/or the damage. The part of the thesis presenting the results of this study and activity is structured as follows: chapter 3 deals with the propagation of electromagnetic transients in power systems, defining the characteristics and causes of the phenomena and briefly reporting the theory and approaches used to study transient propagation. The state of the art concerning methods to detect and locate faults in distribution networks is then presented. Finally, attention is paid to the particular technique adopted for this purpose in the thesis, and to the methods developed on the basis of such an approach. Chapter 4 reports the configuration of the distribution networks on which the fault location method has been applied by means of simulations, as well as the results obtained case by case.
In this way the performance of the location procedure is assessed first under ideal and then under realistic operating conditions. Chapter 5 presents the measurement system designed to implement the transient detection and fault location method. The hardware belonging to the measurement chain of every acquisition channel in the remote stations is described. The global measurement system is then characterized by considering the non-ideal aspects of each device that contribute to the final combined uncertainty on the estimated position of the fault in the network under test. Finally, this parameter is computed according to the Guide to the Expression of Uncertainty in Measurement, by means of a numerical procedure. The last chapter describes a device designed and built during the PhD activity to replace the commercial capacitive voltage divider in the conditioning block of the measurement chain. This study was carried out with the aim of providing an alternative to the transducer in use, offering equivalent performance at lower cost. In this way, the economic impact of the investment associated with the whole measurement system would be significantly reduced, making the application of the method much more feasible.
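As an illustration of the kind of computation involved in locating a transient source from time tags registered by remote stations, the sketch below applies the classic double-ended arrival-time formula to a single line of known length and propagates an assumed timing uncertainty with a simple Monte Carlo loop, in the spirit of the GUM-based numerical procedure mentioned above. The geometry, propagation speed and uncertainties are invented, and the thesis's actual method may differ:

```python
import numpy as np

L = 10_000.0            # line length between the two remote stations (m), illustrative
v = 2.0e8               # assumed propagation speed of the transient along the line (m/s)

# Illustrative detected arrival times of the transient front at stations A and B (s).
t_A, t_B = 20.0e-6, 30.0e-6

def fault_distance(t_a, t_b, line_length=L, speed=v):
    """Distance of the fault from station A (double-ended arrival-time method)."""
    return 0.5 * (line_length + speed * (t_a - t_b))

print(f"Estimated fault position: {fault_distance(t_A, t_B):.0f} m from station A")

# Very simplified Monte Carlo propagation of the timing uncertainty.
rng = np.random.default_rng(0)
sigma_t = 0.5e-6        # assumed standard uncertainty of each time tag (s)
samples = fault_distance(t_A + rng.normal(0, sigma_t, 100_000),
                         t_B + rng.normal(0, sigma_t, 100_000))
print(f"Standard uncertainty on the estimated position: {samples.std():.0f} m")
```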
Abstract:
Providing support for multimedia applications on low-power mobile devices remains a significant research challenge. This is primarily due to two reasons: • Portable mobile devices have modest sizes and weights, and therefore inadequate resources: low CPU processing power, reduced display capabilities, and limited memory and battery lifetimes compared to desktop and laptop systems. • On the other hand, multimedia applications tend to have distinctive QoS and processing requirements which make them extremely resource-demanding. This innate conflict introduces key research challenges in the design of multimedia applications and device-level power optimization. Energy efficiency in this kind of platform can be achieved only via a synergistic hardware and software approach. In fact, while Systems-on-Chip are more and more programmable, thus providing functional flexibility, hardware-only power reduction techniques cannot keep consumption within acceptable bounds. It is well understood both in research and in industry that system configuration and management cannot be controlled efficiently by relying only on low-level firmware and hardware drivers. In fact, at this level there is a lack of information about user application activity and consequently about the impact of power management decisions on QoS. Even though operating system support and integration is a requirement for effective performance and energy management, more effective and QoS-sensitive power management is possible if power awareness and hardware configuration control strategies are tightly integrated with domain-specific middleware services. The main objective of this PhD research has been the exploration and the integration of a middleware-centric energy management with applications and the operating system. We chose to focus on the CPU-memory and the video subsystems, since they are the most power-hungry components of an embedded system. A second main objective has been the definition and implementation of software facilities (such as toolkits, APIs, and run-time engines) in order to improve the programmability and performance efficiency of such platforms. Enhancing energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs): Consumer applications are characterized by tight time-to-market constraints and extreme cost sensitivity. The software that runs on modern embedded systems must be high performance, real time, and, even more important, low power. Although much progress has been made on these problems, much remains to be done. Multi-Processor Systems-on-Chip (MPSoCs) are increasingly popular platforms for high-performance embedded applications. This leads to interesting challenges in software development, since efficient software development is a major issue for MPSoC designers. An important step in deploying applications on multiprocessors is to allocate and schedule concurrent tasks to the processing and communication resources of the platform. The problem of allocating and scheduling precedence-constrained tasks on the processors of a distributed real-time system is NP-hard. There is a clear need for deployment technology that addresses these multiprocessing issues. This problem can be tackled by means of specific middleware which takes care of allocating and scheduling tasks on the different processing elements and which also tries to optimize the power consumption of the entire multiprocessor platform.
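To make the allocation-and-scheduling problem mentioned above concrete, the sketch below greedily list-schedules a small precedence-constrained task graph onto two processors; it is a naive heuristic of exactly the kind that leaves an optimality gap, in contrast with the complete, optimal methods developed in the thesis (the task set and execution times are invented):

```python
# Naive greedy list scheduling of a precedence-constrained task graph onto 2 processors.
# Illustrative only: the thesis develops complete (optimal) methods, not this heuristic.

tasks = {            # task -> (execution time, list of predecessors)
    "A": (3, []),
    "B": (2, ["A"]),
    "C": (4, ["A"]),
    "D": (1, ["B", "C"]),
}

proc_ready = [0, 0]              # time at which each processor becomes free
finish = {}                      # task -> finish time
placement = {}                   # task -> processor
scheduled = set()

while len(scheduled) < len(tasks):
    # Pick any task whose predecessors have all been scheduled (a ready task).
    ready = [t for t, (_, preds) in tasks.items()
             if t not in scheduled and all(p in scheduled for p in preds)]
    task = ready[0]
    exec_time, preds = tasks[task]
    earliest = max((finish[p] for p in preds), default=0)
    # Greedy choice: the processor that lets the task finish earliest.
    proc = min(range(2), key=lambda p: max(proc_ready[p], earliest) + exec_time)
    start = max(proc_ready[proc], earliest)
    finish[task] = start + exec_time
    proc_ready[proc] = finish[task]
    placement[task] = proc
    scheduled.add(task)

print(placement, "makespan =", max(finish.values()))
```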
This dissertation is an attempt to develop insight into efficient, flexible and optimal methods for allocating and scheduling concurrent applications to multiprocessor architectures. This is a well-known problem in the literature: optimization problems of this kind are very complex even in highly simplified variants, so most authors propose simplified models and heuristic approaches to solve them in reasonable time. Model simplification is often achieved by abstracting away platform implementation "details". As a result, the optimization problems become more tractable, even reaching polynomial time complexity. Unfortunately, this approach creates an abstraction gap between the optimization model and the real HW-SW platform. The main issue with heuristic or, more generally, incomplete search is that it introduces an optimality gap of unknown size: it provides very limited or no information on the distance between the best computed solution and the optimal one. The goal of this work is to address both the abstraction and the optimality gaps, by formulating accurate models which account for a number of "non-idealities" of real-life hardware platforms, by developing novel mapping algorithms that deterministically find optimal solutions, and by implementing the software infrastructures required by developers to deploy applications on the target MPSoC platforms. Energy Efficient LCD Backlight Autoregulation on a Real-Life Multimedia Application Processor: Despite the ever increasing advances in Liquid Crystal Display (LCD) technology, LCD power consumption is still one of the major limitations to the battery life of mobile appliances such as smart phones, portable media players, and gaming and navigation devices. There is a clear trend towards increasing LCD size to exploit the multimedia capabilities of portable devices that can receive and render high-definition video and pictures. Multimedia applications running on these devices require LCD screen sizes of 2.2 to 3.5 inches and more to display video sequences and pictures with the required quality. LCD power consumption depends on the backlight and on the pixel-matrix driving circuits and is typically proportional to the panel area. As a result, its contribution is also likely to be considerable in future mobile appliances. To address this issue, companies are proposing low-power technologies suitable for mobile applications, supporting low-power states and image control techniques. On the research side, several power saving schemes and algorithms can be found in the literature. Some of them exploit software-only techniques that change the image content to reduce the power associated with the crystal polarization; others aim at decreasing the backlight level while compensating for the resulting luminance reduction, and hence for the perceived quality degradation, using pixel-by-pixel image processing algorithms. The major limitation of these techniques is that they rely on the CPU to perform pixel-based manipulations, and their impact on CPU utilization and power consumption has not been assessed. This PhD dissertation shows an alternative approach that exploits, in a smart and efficient way, the hardware image processing unit integrated in almost every current multimedia application processor to implement a hardware-assisted image compensation that allows dynamic scaling of the backlight with a negligible impact on QoS.
The proposed approach overcomes CPU-intensive techniques by saving system power without requiring either a dedicated display technology or hardware modifications. Thesis Overview: The remainder of the thesis is organized as follows. The first part is focused on enhancing the energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs). Chapter 2 gives an overview of architectural trends in embedded systems, illustrating the principal features of new technologies and the key challenges still open. Chapter 3 presents a QoS-driven methodology for optimal allocation and frequency selection for MPSoCs. The methodology is based on functional simulation and full-system power estimation. Chapter 4 targets the allocation and scheduling of pipelined stream-oriented applications on top of distributed memory architectures with messaging support. We tackled the complexity of the problem by means of decomposition and no-good generation, and prove the increased computational efficiency of this approach with respect to traditional ones. Chapter 5 presents a cooperative framework to solve the allocation, scheduling and voltage/frequency selection problem to optimality for energy-efficient MPSoCs, while in Chapter 6 applications with conditional task graphs are taken into account. Finally, Chapter 7 proposes a complete framework, called Cellflow, to help programmers achieve efficient software implementations on a real architecture, the Cell Broadband Engine processor. The second part is focused on energy-efficient software techniques for LCD displays. Chapter 8 gives an overview of portable-device display technologies, illustrating the principal features of LCD video systems and the key challenges still open. Chapter 9 reviews several energy-efficient software techniques found in the literature, while Chapter 10 illustrates in detail our method for saving significant power in an LCD panel. Finally, conclusions are drawn, reporting the main research contributions discussed throughout this dissertation.
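Returning to the LCD part of the work, the backlight-compensation principle can be sketched in a few lines of arithmetic on a synthetic frame. The thesis offloads this processing to the hardware image processing unit of the application processor, so the snippet below only illustrates the compensation itself, not the proposed implementation:

```python
import numpy as np

def dim_with_compensation(image, backlight):
    """
    Scale the backlight down by `backlight` (0 < backlight <= 1) and compensate the
    pixel values so that the perceived luminance (pixel * backlight) is preserved
    wherever the panel's dynamic range allows it.
    """
    compensated = np.clip(image / backlight, 0.0, 1.0)   # brighten pixels, saturating at full scale
    perceived = compensated * backlight                   # what the user effectively sees
    return compensated, perceived

# Illustrative frame: normalized luminance values in [0, 1].
frame = np.random.default_rng(0).uniform(0.0, 0.8, size=(480, 640))
compensated, perceived = dim_with_compensation(frame, backlight=0.7)

error = np.abs(perceived - frame).mean()
print(f"Mean luminance error after a 30% backlight reduction: {error:.4f}")
```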
Abstract:
The need to synchronize one's data arises in a multitude of situations: the number of computing devices at our disposal keeps growing and, as their number increases, so does the need to keep the multiple copies of the data stored on them up to date. Several factors complicate this situation, among them the ever-greater variety of operating systems used on the different devices: Microsoft Windows, the many Linux distributions, Mac OS X, Solaris and other UNIX operating systems, not to mention the operating systems oriented towards the mobile sector, such as Android. Each operating system also has its own particular way of handling data; consider, for example, the different handling of file permissions or case sensitivity. One should also consider that if the data were updated on only one of these devices, a simple copy of the updated data to the other devices would suffice, but this approach is not always possible. In fact, data are often updated independently on more than one device, possibly at the same time; the applications in charge of synchronizing such data must therefore recognize conflict situations, in which the same data have been updated in more than one copy and in different ways, and must allow them to be resolved, bringing the replicas back to a consistent state. Considering the importance and value that data can have, both professionally and personally, such applications must be able to guarantee their safety, avoiding any damage to them, because more and more often the value of a device depends more on the data it contains than on the cost of the hardware. This thesis presents some alternative ideas on how data can be shared and synchronized across different operating systems, both when they are installed on the same device and across different devices. The first part of the thesis describes the Unison application in detail. This application keeps replicas of the data, stored on different devices that may also run different operating systems, synchronized with one another. Unison works at the user level, analyzing the state of the replicas separately at execution time, that is, without keeping track of the operations performed on the data to bring them from their previous state to the current one. Unison allows synchronization even when the data have been modified independently on more than one device, resolving any conflicts that may arise while respecting the user's intentions. The strategies adopted by its designers to guarantee the safety of the data entrusted to it, and how these strategies behave under the most diverse conditions, are highlighted. A detailed analysis of how the application can be used is then provided, with an accurate description of its functionality and several examples to clarify its operation. The second part of the thesis describes instead how to share file systems between different operating systems within the same machine; this is an approach diametrically opposed to the previous one, in which, instead of a single copy of the data, a replica was kept for each device involved.
Focusing on the Linux and Microsoft Windows operating systems, the tools employed are described in depth and the underlying technical characteristics are illustrated.
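A conceptual sketch of the state-based reconciliation idea described above: each replica is compared against the state recorded at the last successful synchronization, one-sided changes are propagated, and divergent two-sided changes are flagged as conflicts to be resolved by the user. This is only an illustration of the principle, not Unison's actual algorithm or data structures:

```python
def reconcile(archive, replica_a, replica_b):
    """
    State-based reconciliation over {path: content} snapshots.
    `archive` is the state recorded at the end of the last successful synchronization.
    Returns a list of (path, action) decisions.
    """
    decisions = []
    for path in sorted(set(archive) | set(replica_a) | set(replica_b)):
        old = archive.get(path)
        a, b = replica_a.get(path), replica_b.get(path)
        changed_a, changed_b = (a != old), (b != old)
        if not changed_a and not changed_b:
            decisions.append((path, "in sync"))
        elif changed_a and not changed_b:
            decisions.append((path, "propagate A -> B"))
        elif changed_b and not changed_a:
            decisions.append((path, "propagate B -> A"))
        elif a == b:
            decisions.append((path, "both changed identically"))
        else:
            decisions.append((path, "CONFLICT: ask the user"))
    return decisions

archive   = {"notes.txt": "v1", "todo.txt": "v1"}
replica_a = {"notes.txt": "v2", "todo.txt": "v1", "new.txt": "hello"}
replica_b = {"notes.txt": "v3", "todo.txt": "v1"}

for path, action in reconcile(archive, replica_a, replica_b):
    print(f"{path}: {action}")
```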