853 results for Heterogeneous interacting-agent model
Abstract:
This thesis proposes a new document model, according to which any document can be segmented into independent components and transformed into a pattern-based projection that uses only a very small set of objects and composition rules. The point is that such a normalized document expresses the same fundamental information as the original, in a simple, clear and unambiguous way. The central part of my work consists of discussing that model, investigating how a digital document can be segmented, and how a segmented version can be used to implement advanced conversion tools. I present seven patterns which are versatile enough to capture the most relevant document structures, and whose minimality and rigour make that implementation possible. The abstract model is then instantiated into an actual markup language, called IML. IML is a general and extensible language which basically adopts an XHTML syntax and is able to capture, a posteriori, only the content of a digital document. It is compared with other languages and proposals in order to clarify its role and objectives. Finally, I present some systems built upon these ideas. These applications are evaluated in terms of user advantages, workflow improvements and impact on the overall quality of the output. In particular, they cover heterogeneous content management processes: from web editing to collaboration (IsaWiki and WikiFactory), from e-learning (IsaLearning) to professional printing (IsaPress).
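The idea of projecting a document onto a small, fixed pattern vocabulary can be sketched in a few lines. The pattern names and the segmentation rule below are purely illustrative assumptions, not the actual seven IML patterns described in the thesis.

```python
# Hypothetical sketch of a pattern-based projection: each source node is
# mapped onto a tiny vocabulary of structural patterns, discarding
# presentational detail. The pattern set here is illustrative only.

def project(node):
    """Map a (tag, children) source node onto a normalized pattern node."""
    tag, children = node
    if not children:                       # leaf element -> "inline" pattern
        return ("inline", tag)
    if all(isinstance(c, str) for c in children):
        return ("block", children)         # flat text run -> "block" pattern
    # nested structure -> "container" pattern over projected children
    return ("container",
            [project(c) for c in children if not isinstance(c, str)])

doc = ("div", [("p", []), ("section", [("p", []), ("p", [])])])
print(project(doc))
```

Whatever the concrete pattern set, the point illustrated is that two source documents with different surface markup can normalize to the same projection.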
Abstract:
The ideal approach for the long-term treatment of intestinal disorders, such as inflammatory bowel disease (IBD), is a safe and well-tolerated therapy able to reduce mucosal inflammation and maintain homeostasis of the intestinal microbiota. A combined therapy with antimicrobial agents, to reduce the antigenic load, and immunomodulators, to ameliorate the dysregulated responses, followed by probiotic supplementation, has been proposed. Because of the complementary mechanisms of action of antibiotics and probiotics, a combined therapeutic approach would give advantages in terms of enlargement of the antimicrobial spectrum, due to the barrier effect of probiotic bacteria, and limitation of some side effects of traditional chemotherapy (i.e. indiscriminate decrease of aggressive and protective intestinal bacteria, altered absorption of nutrients, allergic and inflammatory reactions). Rifaximin (4-deoxy-4’-methylpyrido[1’,2’-1,2]imidazo[5,4-c]rifamycin SV) is a product of synthesis experiments designed to modify the parent compound, rifamycin, in order to achieve low gastrointestinal absorption while retaining good antibacterial activity. Both experimental and clinical pharmacology clearly show that this compound is a non-systemic antibiotic with a broad spectrum of antibacterial action, covering Gram-positive and Gram-negative organisms, both aerobes and anaerobes. Being virtually non-absorbed, its bioavailability within the gastrointestinal tract is rather high, with intraluminal and faecal drug concentrations that largely exceed the MIC values observed in vitro against a wide range of pathogenic microorganisms. The gastrointestinal tract therefore represents the primary therapeutic target, and gastrointestinal infections the main indication. The negligible activity of rifaximin outside the enteric area minimizes both antimicrobial resistance and systemic adverse events.
Fermented dairy products enriched with probiotic bacteria have developed into one of the most successful categories of functional foods. Probiotics are defined as “live microorganisms which, when administered in adequate amounts, confer a health benefit on the host” (FAO/WHO, 2002), and mainly include Lactobacillus and Bifidobacterium species. Probiotic bacteria exert a direct effect on the intestinal microbiota of the host and contribute to the organoleptic, rheological and nutritional properties of food. Administration of pharmaceutical probiotic formulas has been associated with therapeutic effects in the treatment of diarrhoea, constipation, flatulence, colonization by enteropathogens, gastroenteritis, hypercholesterolemia, IBD, such as ulcerative colitis (UC), Crohn’s disease, pouchitis, and irritable bowel syndrome. Prerequisites for probiotics are that they be effective and safe. The characteristics of an effective probiotic for gastrointestinal tract disorders are tolerance of the upper gastrointestinal environment (resistance to digestion by enteric or pancreatic enzymes, gastric acid and bile), adhesion to the intestinal surface to lengthen the retention time, ability to prevent the adherence, establishment and/or replication of pathogens, production of antimicrobial substances, degradation of toxic catabolites by bacterial detoxifying enzymatic activities, and modulation of the host immune responses. This study was carried out using a validated three-stage continuous fermentative system and aimed to investigate the effect of rifaximin on the colonic microbial flora of a healthy individual, in terms of bacterial composition and production of fermentative metabolic end products. Moreover, this is the first study to investigate in vitro the impact of the simultaneous administration of the antibiotic rifaximin and the probiotic B. lactis BI07 on the intestinal microbiota.
Bacterial groups of interest were evaluated using culture-based methods and molecular culture-independent techniques (FISH, PCR-DGGE). Metabolic outputs in terms of SCFA profiles were determined by HPLC analysis. The collected data demonstrated that neither the rifaximin treatment nor the combined antibiotic/probiotic treatment drastically changed the intestinal microflora, whereas bacteria belonging to Bifidobacterium and Lactobacillus increased significantly over the course of the treatment, suggesting a spontaneous upsurge of rifaximin resistance. These results agree with a previous study, which demonstrated that rifaximin administration in patients with UC causes only minor variations of the intestinal microflora, and that the microbiota is restored over a wash-out period. In particular, several rifaximin-resistant Bifidobacterium mutants could be isolated during the antibiotic treatment, but they disappeared after the antibiotic was suspended. Furthermore, bacteria belonging to Atopobium spp. and the E. rectale/Clostridium cluster XIVa increased significantly after the rifaximin and probiotic treatment. The Atopobium genus and the E. rectale/Clostridium cluster XIVa are saccharolytic, butyrate-producing bacteria, and for these characteristics they are widely considered health-promoting microorganisms. The absence of major variations in the intestinal microflora of a healthy individual and the significant increase in probiotic and health-promoting bacteria concentrations support the rationale for administering rifaximin as an efficacious, non-dysbiosis-promoting therapy, and suggest the efficacy of a combined antibiotic/probiotic treatment in several gut pathologies, such as IBD.
To assess the use of an antibiotic/probiotic combination for the clinical management of intestinal disorders, genetic, proteomic and physiologic approaches were employed to elucidate the molecular mechanisms determining rifaximin resistance in Bifidobacterium, and the expected interactions occurring in the gut between these bacteria and the drug. The ability of an antimicrobial agent to select resistance is a relevant factor that affects its usefulness and may diminish its useful life. The rifaximin resistance phenotype was easily acquired by all bifidobacteria analyzed [type strains of the most representative intestinal bifidobacterial species (B. infantis, B. breve, B. longum, B. adolescentis and B. bifidum) and three bifidobacteria included in a pharmaceutical probiotic preparation (B. lactis BI07, B. breve BBSF and B. longum BL04)] and persisted for more than 400 bacterial generations in the absence of selective pressure. Exclusion of any reversion phenomenon suggested two hypotheses: (i) stable and immobile genetic elements encode resistance; (ii) the drug moiety does not act as an inducer of the resistance phenotype, but enables selection of resistant mutants. Since point mutations in rpoB have been indicated as the principal factor determining rifampicin resistance in E. coli and M. tuberculosis, it was verified whether a similar mechanism also occurs in Bifidobacterium. The analysis of a 129 bp rpoB core region of several wild-type and resistant bifidobacteria revealed five different types of missense mutations in codons 513, 516, 522 and 529. Position 529 was a novel mutation site, not previously described, and position 522 appeared interesting for both the double point substitutions and the heterogeneous profile of nucleotide changes. The sequence heterogeneity of codon 522 in Bifidobacterium suggests an indirect role of its encoded amino acid in the binding with the rifaximin moiety.
These results demonstrated the chromosomal nature of rifaximin resistance in Bifidobacterium, minimizing the risk factors for horizontal transmission of resistance elements between intestinal microbial species. Further proteomic and physiologic investigations were carried out using B. lactis BI07, a component of a pharmaceutical probiotic preparation, as a model strain. The choice of this strain was based on the following elements: (i) B. lactis BI07 is able to survive and persist in the gut; (ii) a proteomic overview of this strain has recently been reported. The involvement of metabolic changes associated with rifaximin resistance was investigated by proteomic analysis performed with two-dimensional electrophoresis and mass spectrometry. Comparative proteomic mapping of BI07-wt and BI07-res revealed that most differences in protein expression patterns were genetically encoded rather than induced by antibiotic exposure. In particular, the rifaximin resistance phenotype was characterized by increased expression levels of stress proteins. Overexpression of stress proteins was expected, as they represent a common non-specific response of bacteria stimulated by different shock conditions, including exposure to toxic agents such as heavy metals, oxidants, acids, bile salts and antibiotics. Also, positive transcription regulators were found to be overexpressed in BI07-res, suggesting that bacteria could activate compensatory mechanisms to assist the transcription process in the presence of RNA polymerase inhibitors. Other differences in expression profiles were related to proteins involved in central metabolism; these modifications suggest metabolic disadvantages of resistant mutants in comparison with sensitive bifidobacteria in the gut environment without selective pressure, explaining their disappearance from the faeces of patients with UC after interruption of the antibiotic treatment.
The differences observed between the BI07-wt and BI07-res proteomic patterns, as well as the high frequency of silent mutations reported for resistant mutants of Bifidobacterium, could be consequences of an increased mutation rate, a mechanism which may lead to persistence of resistant bacteria in the population. However, the in vivo disappearance of resistant mutants in the absence of selective pressure allows us to exclude the upsurge of compensatory mutations without loss of resistance. Furthermore, the proteomic characterization of the resistant phenotype suggests that rifaximin resistance is associated with a reduced bacterial fitness in B. lactis BI07-res, supporting the hypothesis of a biological cost of antibiotic resistance in Bifidobacterium. The hypothesis of rifaximin inactivation by bacterial enzymatic activities was tested using liquid chromatography coupled with tandem mass spectrometry. Neither chemical modifications nor degradation derivatives of the rifaximin moiety were detected. The exclusion of a biodegradation pattern for the drug was further supported by the quantitative recovery in BI07-res culture fractions of the total rifaximin amount (100 μg/ml) added to the culture medium. To confirm the main role of the mutation in the β chain of RNA polymerase in rifaximin resistance acquisition, the transcription activity of crude enzymatic extracts of BI07-res cells was evaluated. Although the inhibition effects of rifaximin on in vitro transcription were definitely higher for BI07-wt than for BI07-res, a partial resistance of the mutated RNA polymerase at rifaximin concentrations > 10 μg/ml was hypothesized, on the basis of the calculated differences in inhibition percentages between BI07-wt and BI07-res. Considering the resistance of entire BI07-res cells to rifaximin concentrations > 100 μg/ml, supplementary resistance mechanisms may take place in vivo.
A barrier to rifaximin uptake in BI07-res cells was suggested in this study, on the basis of the major portion of the antibiotic being bound to the cellular pellet with respect to the portion recovered in the cellular lysate. In relation to this finding, a resistance mechanism involving changes of membrane permeability was hypothesized. A previous study supports this hypothesis, demonstrating the involvement of surface properties and permeability in natural resistance to rifampicin in mycobacteria, isolated from cases of human infection, which possessed a rifampicin-susceptible RNA polymerase. To understand the mechanism of the membrane barrier, variations in the percentages of saturated and unsaturated FAs and their methylation products in BI07-wt and BI07-res membranes were investigated. While saturated FAs confer rigidity on the membrane and resistance to stress agents, such as antibiotics, a high level of lipid unsaturation is associated with high fluidity and susceptibility to stresses. Thus, the higher percentage of saturated FAs during the stationary phase of BI07-res could represent a defence mechanism of mutant cells to prevent antibiotic uptake. Furthermore, the increase of CFAs such as dihydrosterculic acid during the stationary phase of BI07-res suggests that this CFA could be more suitable than its isomer, lactobacillic acid, to interact with and prevent the penetration of exogenous molecules, including rifaximin. Finally, the impact of rifaximin on immune regulatory functions of the gut was evaluated. A potential anti-inflammatory effect of rifaximin has been suggested, with reduced secretion of IFN-γ in a rodent model of colitis. Analogously, a significant decrease in IL-8, MCP-1, MCP-3 and IL-10 levels has been reported in patients affected by pouchitis and treated with a combined therapy of rifaximin and ciprofloxacin.
Since rifaximin enables in vivo and in vitro selection of Bifidobacterium resistant mutants with high frequency, the immunomodulation activities of rifaximin associated with a B. lactis resistant mutant were also taken into account. Data obtained from PBMC stimulation experiments suggest the following conclusions: (i) rifaximin does not exert any effect on the production of IL-1β, IL-6 and IL-10, whereas it weakly stimulates production of TNF-α; (ii) B. lactis appears to be a good inducer of IL-1β, IL-6 and TNF-α; (iii) the combination of BI07-res and rifaximin exhibits a lower stimulation effect than BI07-res alone, especially for IL-6. These results confirm the potential anti-inflammatory effect of rifaximin, and are in agreement with several studies that report a transient pro-inflammatory response associated with probiotic administration. The understanding of the molecular factors determining rifaximin resistance in the genus Bifidobacterium has practical significance at the pharmaceutical and medical levels, as it represents the scientific basis justifying the simultaneous use of the antibiotic rifaximin and probiotic bifidobacteria in the clinical treatment of intestinal disorders.
Abstract:
The dynamicity and heterogeneity that characterize pervasive environments raise new challenges in the design of mobile middleware. Pervasive environments exhibit a significant degree of heterogeneity, variability, and dynamicity that conventional middleware solutions are not able to manage adequately. Originally designed for use in a relatively static context, such middleware systems tend to hide low-level details to provide applications with a transparent view of the underlying execution platform. In mobile environments, however, the context is extremely dynamic and cannot be managed by a priori assumptions. Novel middleware should therefore support mobile computing applications in the task of adapting their behavior to frequent changes in the execution context, that is, it should become context-aware. In particular, this thesis has identified the following key requirements for novel context-aware middleware that existing solutions do not yet fulfil. (i) Middleware solutions should support interoperability between possibly unknown entities by providing expressive representation models that make it possible to describe interacting entities, their operating conditions and the surrounding world, i.e., their context, according to an unambiguous semantics. (ii) Middleware solutions should support distributed applications in the task of reconfiguring and adapting their behavior/results to ongoing context changes. (iii) Context-aware middleware support should be deployable on heterogeneous devices under variable operating conditions, such as different user needs, application requirements, available connectivity and device computational capabilities, as well as changing environmental conditions. Our main claim is that the adoption of semantic metadata to represent context information and context-dependent adaptation strategies makes it possible to build context-aware middleware suitable for all dynamically available portable devices.
Semantic metadata provide powerful knowledge representation means to model even complex context information, and enable automated reasoning to infer additional and/or more complex knowledge from available context data. In addition, we suggest that, by adopting proper configuration and deployment strategies, semantic support features can be provided to differentiated users and devices according to their specific needs and current context. This thesis has investigated novel design guidelines and implementation options for semantic-based context-aware middleware solutions targeted at pervasive environments. These guidelines have been applied to different application areas within pervasive computing that would particularly benefit from the exploitation of context. Common to all applications is the key role of context in enabling mobile users to personalize applications based on their needs and current situation. The main contributions of this thesis are (i) the definition of a metadata model to represent and reason about context, (ii) the definition of a model for the design and development of context-aware middleware based on semantic metadata, (iii) the design of three novel middleware architectures and the development of a prototype implementation for each of these architectures, and (iv) the proposal of a viable approach to the portability issues raised by the adoption of semantic support services in pervasive applications.
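The claim that semantic metadata enable inference of additional knowledge from available context data can be illustrated with a minimal forward-chaining sketch over context triples. The facts, predicate names, and the single containment rule below are hypothetical illustrations of the general idea, not the thesis' actual metadata model.

```python
# Minimal sketch of rule-based inference over semantic context metadata.
# All predicate names and facts here are illustrative assumptions.

facts = {
    ("alice", "locatedIn", "room42"),
    ("room42", "partOf", "building7"),
}

def infer_located_in(facts):
    """Propagate location through part-of containment (one inference rule),
    iterating to a fixed point."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        new = {
            (who, "locatedIn", whole)
            for (who, p1, place) in derived if p1 == "locatedIn"
            for (part, p2, whole) in derived if p2 == "partOf" and part == place
        }
        if not new <= derived:
            derived |= new
            changed = True
    return derived

# The middleware-style payoff: a fact never stated explicitly is derived.
print(("alice", "locatedIn", "building7") in infer_located_in(facts))  # True
```

A real semantic middleware would of course use a standard ontology language and reasoner rather than hand-written rules; the sketch only shows why reasoning over context metadata yields knowledge that plain key-value context stores cannot.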
Abstract:
The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor.
We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
Resumo:
9-hydroxystearic acid (9-HSA) is an endogenous lipoperoxidation product and its administration to HT29, a colon adenocarcinoma cell line, induced a proliferative arrest in G0/G1 phase mediated by a direct activation of the p21WAF1 gene, bypassing p53. We have previously shown that 9-HSA controls cell growth and differentiation by inhibiting histone deacetylase 1 (HDAC1) activity, showing interesting features as a new anticancer drug. The interaction of 9-HSA with the catalytic site of the 3D model has been tested with a docking procedure: noticeably, when interacting with the site, the (R)-9-enantiomer is more stable than the (S) one. Thus, in this study, (R)- and (S)-9-HSA were synthesized and their biological activity tested in HT29 cells. At the concentration of 50 M (R)-9-HSA showed a stronger antiproliferative effect than the (S) isomer, as indicated by the growth arrest in G0/G1. The inhibitory effect of (S)-9-HSA on HDAC1, HDAC2 and HDAC3 activity was less effective than that of the (R)-9-HSA in vitro, and the inhibitory activity of both the (R)- and the (S)-9-HSA isomer, was higher on HDAC1 compared to HDAC2 and HDAC3, thus demonstrating the stereospecific and selective interaction of 9-HSA with HDAC1. In addition, histone hyperacetylation caused by 9-HSA treatment was examined by an innovative HPLC/ESI/MS method. Analysis on histones isolated from control and treated HT29 confirmed the higher potency of (R)-9-HSA compared to (S)-9-HSA, severely affecting H2A-2 and H4 acetylation. On the other side, it seemed of interest to determine whether the G0/G1 arrest of HT29 cell proliferation could be bypassed by the stimulation with the growth factor EGF. Our results showed that 9-HSA-treated cells were not only prevented from proliferating, but also showed a decreased [3H]thymidine incorporation after EGF stimulation. In this condition, HT29 cells expressed very low levels of cyclin D1, that didn’t colocalize with HDAC1. 
These results suggested that the cyclin D1/HDAC1 complex is required for proliferation. Furthermore, in the effort of understanding the possible mechanisms of this effect, we have analyzed the degree of internalization of the EGF/EGFR complex and its interactions with HDAC1. EGF/EGFR/HDAC1 complex quantitatively increases in 9-HSA-treated cells but not in serum starved cells after EGF stimulation. Our data suggested that 9-HSA interaction with the catalytic site of the HDAC1 disrupts the HDAC1/cyclin D1 complex and favors EGF/EGFR recruitment by HDAC1, thus enhancing 9-HSA antiproliferative effects. In conclusion 9-HSA is a promising HDAC inhibitor with high selectivity and specificity, capable of inducing cell cycle arrest and histone hyperacetylation, but also able to modulate HDAC1 protein interaction. All these aspects may contribute to the potency of this new antitumor agent.
Resumo:
The β-Amyloid (βA) peptide is the major component of senile plaques that are one of the hallmarks of Alzheimer’s Disease (AD). It is well recognized that Aβ exists in multiple assembly states, such as soluble oligomers or insoluble fibrils, which affect neuronal viability and may contribute to disease progression. In particular, common βA-neurotoxic mechanisms are Ca2+ dyshomeostasis, reactive oxygen species (ROS) formation, altered signaling, mitochondrial dysfunction and neuronal death such as necrosis and apoptosis. Recent study shows that the ubiquitin-proteasome pathway play a crucial role in the degradation of short-lived and regulatory proteins that are important in a variety of basic and pathological cellular processes including apoptosis. Guanosine (Guo) is a purine nucleoside present extracellularly in brain that shows a spectrum of biological activities, both under physiological and pathological conditions. Recently it has become recognized that both neurons and glia also release guanine-based purines. However, the role of Guo in AD is still not well established. In this study, we investigated the machanism basis of neuroprotective effects of GUO against Aβ peptide-induced toxicity in neuronal (SH-SY5Y), in terms of mitochondrial dysfunction and translocation of phosphatidylserine (PS), a marker of apoptosis, using MTT and Annexin-V assay, respectively. In particular, treatment of SH-SY5Y cells with GUO (12,5-75 μM) in presence of monomeric βA25-35 (neurotoxic core of Aβ), oligomeric and fibrillar βA1-42 peptides showed a strong dose-dependent inhibitory effects on βA-induced toxic events. The maximum inhibition of mitochondrial function loss and PS translocation was observed with 75 μM of Guo. Subsequently, to investigate whether neuroprotection of Guo can be ascribed to its ability to modulate proteasome activity levels, we used lactacystin, a specific inhibitor of proteasome. 
We found that the antiapoptotic effects of Guo were completely abolished by lactacystin. To rule out the possibility that this effects resulted from an increase in proteasome activity by Guo, the chymotrypsin-like activity was assessed employing the fluorogenic substrate Z-LLL-AMC. The treatment of SH-SY5Y with Guo (75 μM for 0-6 h) induced a strong increase, in a time-dependent manner, of proteasome activity. In parallel, no increase of ubiquitinated protein levels was observed at similar experimental conditions adopted. We then evaluated an involvement of anti and pro-apoptotic proteins such as Bcl-2, Bad and Bax by western blot analysis. Interestingly, Bax levels decreased after 2 h treatment of SH-SY5Y with Guo. Taken together, these results demonstrate that Guo neuroprotective effects against βA-induced apoptosis are mediated, at least partly, via proteasome activation. In particular, these findings suggest a novel neuroprotective pathway mediated by Guo, which involves a rapid degradation of pro-apoptotic proteins by the proteasome. In conclusion, the present data, raise the possibility that Guo could be used as an agent for the treatment of AD.
Resumo:
Two of the main features of today complex software systems like pervasive computing systems and Internet-based applications are distribution and openness. Distribution revolves around three orthogonal dimensions: (i) distribution of control|systems are characterised by several independent computational entities and devices, each representing an autonomous and proactive locus of control; (ii) spatial distribution|entities and devices are physically distributed and connected in a global (such as the Internet) or local network; and (iii) temporal distribution|interacting system components come and go over time, and are not required to be available for interaction at the same time. Openness deals with the heterogeneity and dynamism of system components: complex computational systems are open to the integration of diverse components, heterogeneous in terms of architecture and technology, and are dynamic since they allow components to be updated, added, or removed while the system is running. The engineering of open and distributed computational systems mandates for the adoption of a software infrastructure whose underlying model and technology could provide the required level of uncoupling among system components. This is the main motivation behind current research trends in the area of coordination middleware to exploit tuple-based coordination models in the engineering of complex software systems, since they intrinsically provide coordinated components with communication uncoupling and further details in the references therein. An additional daunting challenge for tuple-based models comes from knowledge-intensive application scenarios, namely, scenarios where most of the activities are based on knowledge in some form|and where knowledge becomes the prominent means by which systems get coordinated. 
Handling knowledge in tuple-based systems induces problems in terms of syntax - e.g., two tuples containing the same data may not match due to differences in the tuple structure - and (mostly) of semantics|e.g., two tuples representing the same information may not match based on a dierent syntax adopted. Till now, the problem has been faced by exploiting tuple-based coordination within a middleware for knowledge intensive environments: e.g., experiments with tuple-based coordination within a Semantic Web middleware (surveys analogous approaches). However, they appear to be designed to tackle the design of coordination for specic application contexts like Semantic Web and Semantic Web Services, and they result in a rather involved extension of the tuple space model. The main goal of this thesis was to conceive a more general approach to semantic coordination. In particular, it was developed the model and technology of semantic tuple centres. It is adopted the tuple centre model as main coordination abstraction to manage system interactions. A tuple centre can be seen as a programmable tuple space, i.e. an extension of a Linda tuple space, where the behaviour of the tuple space can be programmed so as to react to interaction events. By encapsulating coordination laws within coordination media, tuple centres promote coordination uncoupling among coordinated components. Then, the tuple centre model was semantically enriched: a main design choice in this work was to try not to completely redesign the existing syntactic tuple space model, but rather provide a smooth extension that { although supporting semantic reasoning { keep the simplicity of tuple and tuple matching as easier as possible. By encapsulating the semantic representation of the domain of discourse within coordination media, semantic tuple centres promote semantic uncoupling among coordinated components. 
The main contributions of the thesis are: (i) the design of the semantic tuple centre model; (ii) the implementation and evaluation of the model based on an existent coordination infrastructure; (iii) a view of the application scenarios in which semantic tuple centres seem to be suitable as coordination media.
Resumo:
Aim: To evaluate the early response to treatment with an antiangiogenic drug (sorafenib) in a heterotopic murine model of hepatocellular carcinoma (HCC) using ultrasonographic molecular imaging. Material and Methods: The xenograft model was established by injecting a suspension of HuH7 cells subcutaneously into 19 nude mice. When tumors reached a mean diameter of 5-10 mm, they were divided into two groups (treatment and vehicle). The treatment group received sorafenib (62 mg/kg) by daily oral gavage for 14 days. Molecular imaging was performed using contrast-enhanced ultrasound (CEUS) by injecting into the mouse venous circulation a suspension of VEGFR-2-targeted microbubbles (BR55, kind gift of Bracco Swiss, Geneva, Switzerland). Video clips were acquired for 6 minutes, then microbubbles (MBs) were destroyed by a high mechanical index (MI) impulse, and another minute was recorded to evaluate residual circulating MBs. The US protocol was repeated at days 0, +2, +4, +7, and +14 from the beginning of treatment administration. Video clips were analyzed using dedicated software (Sonotumor, Bracco Swiss) to quantify the signal of the contrast agent. Time/intensity curves were obtained and the difference between the mean MB signal before and after the high-MI impulse (differential targeted enhancement, dTE) was calculated. dTE is a numeric value in arbitrary units proportional to the amount of bound MBs. At day +14 mice were euthanized and the tumors analyzed for VEGFR-2, pERK, and CD31 tissue levels using western blot analysis. Results: dTE values decreased from day 0 to day +14 in both the treatment and vehicle groups, and they were statistically higher in the vehicle group than in the treatment group at days +2, +7, and +14. With respect to the degree of tumor volume increase, measured as growth percentage delta (GPD), the treatment group was divided into two sub-groups: non-responders (GPD > 350%) and responders (GPD < 200%).
In the same way, the vehicle group was divided into a slow-growth group (GPD < 400%) and a fast-growth group (GPD > 900%). dTE values at day 0 (immediately before treatment start) were higher in the non-responder than in the responder group, with a statistical difference at day 2, whereas dTE values were higher in the fast-growth group than in the slow-growth group only at day 0. A significant positive correlation was found between VEGFR-2 tissue levels and dTE values, confirming that the level of BR55 tissue enhancement reflects the amount of tissue VEGF receptor. Conclusions: The present findings show that, at least in murine experimental models, CEUS with BR55 is feasible and appears to be a useful tool for predicting tumor growth and response to sorafenib treatment in xenograft HCC.
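The dTE quantity described above amounts to a simple computation on the time/intensity curve. The following sketch is an illustrative reconstruction only; the function name, sampling rate, window length, and signal values are assumptions, not taken from the study:

```python
def differential_targeted_enhancement(times, intensities, burst_time, window=60.0):
    """Illustrative dTE: mean microbubble signal in a window before the
    high-MI destruction pulse minus the mean residual signal after it."""
    pre = [s for t, s in zip(times, intensities)
           if burst_time - window <= t < burst_time]
    post = [s for t, s in zip(times, intensities)
            if burst_time < t <= burst_time + window]
    return sum(pre) / len(pre) - sum(post) / len(post)

# synthetic clip: bound MBs give ~5 a.u., residual circulating MBs ~2 a.u.
times = [float(t) for t in range(0, 420)]            # 7 min sampled at 1 Hz
signal = [5.0 if t < 360.0 else 2.0 for t in times]  # destruction pulse at t = 360 s
dte = differential_targeted_enhancement(times, signal, burst_time=360.0)
```

On this synthetic curve dTE comes out to 3.0 a.u., i.e. the plateau attributable to bound microbubbles.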
Resumo:
A polar stratospheric cloud submodel has been developed and incorporated into a general circulation model including atmospheric chemistry (ECHAM5/MESSy). The formation and sedimentation of polar stratospheric cloud (PSC) particles can thus be simulated, as well as heterogeneous chemical reactions that take place on the PSC particles. For solid PSC particle sedimentation, the need for a tailor-made algorithm has been elucidated. A sedimentation scheme based on first-order approximations of vertical mixing ratio profiles has been developed. It produces relatively little numerical diffusion and deals well with divergent or convergent sedimentation velocity fields. For the determination of solid PSC particle sizes, an efficient algorithm has been adapted. It assumes a monodisperse radius distribution and thermodynamic equilibrium between the gas phase and the solid particle phase. This scheme, though relatively simple, is shown to produce particle number densities and radii within the observed range. The combined effects of the representations of sedimentation and solid PSC particles on the vertical redistribution of H2O and HNO3 are investigated in a series of tests. The formation of solid PSC particles, especially of those consisting of nitric acid trihydrate, has been discussed extensively in recent years. Three particle formation schemes, in accordance with the most widely used approaches, have been identified and implemented. For the evaluation of PSC occurrence, a new data set with unprecedented spatial and temporal coverage was available. A quantitative method for the comparison of simulation results and observations has been developed and applied. It reveals that the relative PSC sighting frequency can be reproduced well with the PSC submodel, whereas the detailed modelling of PSC events is beyond the scope of coarse global-scale models. In addition to the development and evaluation of new PSC submodel components, parts of existing simulation programs have been improved, e.g.
a method for the assimilation of meteorological analysis data in the general circulation model, the liquid PSC particle composition scheme, and the calculation of heterogeneous reaction rate coefficients. The interplay of these model components is demonstrated in a simulation of stratospheric chemistry with the coupled general circulation model. Tests against recent satellite data show that the model successfully reproduces the Antarctic ozone hole.
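To illustrate the kind of first-order sedimentation scheme discussed above (this is a generic first-order upwind step, not the thesis's tailor-made algorithm, whose details are not reproduced here; layer ordering and variable names are assumptions):

```python
def sediment_upwind(q, w, dz, dt):
    """One explicit first-order upwind step for dq/dt = -d(w*q)/dz,
    with layers indexed top-down, fall speed w >= 0 (positive downward),
    uniform layer thickness dz, and time step dt."""
    flux = [w[k] * q[k] for k in range(len(q))]  # downward flux out of each layer
    out = []
    for k in range(len(q)):
        inflow = flux[k - 1] if k > 0 else 0.0   # flux arriving from the layer above
        out.append(q[k] + dt / dz * (inflow - flux[k]))
    return out

# a pulse in the top layer moves downward over one step
profile = sediment_upwind([1.0, 0.0, 0.0], [1.0, 1.0, 1.0], dz=1.0, dt=0.5)
```

Such a scheme is mass-conserving (up to outflow through the bottom boundary) and stable under the usual CFL condition `w * dt / dz <= 1`, but it is diffusive; the first-order profile reconstruction mentioned in the abstract is one way to reduce that diffusion.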
Resumo:
Since the development of quantum mechanics it has been natural to analyze the connection between classical and quantum mechanical descriptions of physical systems. In particular, one expects that, in some sense, when quantum mechanical effects become negligible the system behaves as dictated by classical mechanics. One famous relation between classical and quantum theory is due to Ehrenfest. This result was later developed and put on firm mathematical foundations by Hepp. He proved that matrix elements of bounded functions of quantum observables between suitable coherent states (that depend on Planck's constant h) converge to classical values evolving according to the expected classical equations as h goes to zero. His results were later generalized by Ginibre and Velo to bosonic systems with infinitely many degrees of freedom and to scattering theory. In this thesis we study the classical limit of the Nelson model, which describes non-relativistic particles, whose evolution is dictated by the Schrödinger equation, interacting with a scalar relativistic field, whose evolution is dictated by the Klein-Gordon equation, by means of a Yukawa-type potential. The classical limit is a mean field and weak coupling limit. We prove that the transition amplitude of a creation or annihilation operator, between suitable coherent states, converges in the classical limit to the solution of the system of differential equations that describes the classical evolution of the theory. The quantum evolution operator converges to the evolution operator of fluctuations around the classical solution. Transition amplitudes of normal ordered products of creation and annihilation operators between coherent states converge to suitable products of the classical solutions. Transition amplitudes of normal ordered products of creation and annihilation operators between fixed particle states converge to an average of products of classical solutions, corresponding to different initial conditions.
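The Ehrenfest relation referred to above can be stated, in its standard textbook form, as:

```latex
\frac{d}{dt}\langle \hat{x} \rangle_t = \frac{1}{m}\,\langle \hat{p} \rangle_t,
\qquad
\frac{d}{dt}\langle \hat{p} \rangle_t = -\,\langle V'(\hat{x}) \rangle_t .
```

These are exact identities; the classical limit enters when, as h goes to zero on suitable states, the right-hand side of the second relation may be replaced by the classical force evaluated at the mean position, so that the expectation values obey Newton's equations.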
Resumo:
This thesis addresses the problem of scaling reinforcement learning to high-dimensional and complex tasks. Reinforcement learning here denotes a class of learning methods based on approximate dynamic programming that finds particular application in artificial intelligence and can be used for the autonomous control of simulated agents or real hardware robots in dynamic and unpredictable environments. To this end, regression from samples is used to determine a function that solves an "optimality equation" (Bellman) and from which approximately optimal decisions can be derived. A major hurdle is the dimensionality of the state space, which is often high and therefore poorly accessible to traditional grid-based approximation methods. The goal of this thesis is to make reinforcement learning applicable to (in principle arbitrarily) high-dimensional problems by means of non-parametric function approximation (more precisely, regularization networks). Regularization networks are a generalization of ordinary basis-function networks that parameterize the sought solution by the data, so that the explicit choice of nodes/basis functions is no longer necessary and the "curse of dimensionality" can thus be circumvented for high-dimensional inputs. At the same time, regularization networks are also linear approximators, which are technically easy to handle and for which the existing convergence results for reinforcement learning remain valid (unlike, for example, feed-forward neural networks). All these theoretical advantages, however, come with a very practical problem: the computational cost of using regularization networks inherently scales as O(n^3), where n is the number of data points.
This is particularly problematic because in reinforcement learning the learning process takes place online: the samples are generated by an agent/robot while it interacts with its environment. Updates to the solution must therefore be made immediately and with little computational effort. The contribution of this thesis is accordingly divided into two parts. In the first part we formulate an efficient learning algorithm for regularization networks for solving general regression tasks, tailored specifically to the requirements of online learning. Our approach is based on the procedure of recursive least squares but can, with constant time cost, insert not only new data but also new basis functions into the existing model. This is made possible by the "subset of regressors" approximation, whereby the kernel is approximated by a strongly reduced selection of training data, and by a greedy selection procedure that selects these basis elements directly from the data stream at run time. In the second part we transfer this algorithm to approximate policy evaluation via least-squares-based temporal-difference learning, and integrate this building block into a complete system for the autonomous learning of optimal behaviour. Overall, we develop a highly data-efficient method that is particularly suited to learning problems from robotics with continuous, high-dimensional state spaces and stochastic state transitions. We do not depend on a model of the environment, operate largely independently of the dimension of the state space, achieve convergence with relatively few agent-environment interactions, and, thanks to the efficient online algorithm, can also operate in the context of time-critical real-time applications.
We demonstrate the performance of our approach on two realistic and complex application examples: the RoboCup keepaway problem and the control of a (simulated) octopus tentacle.
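A minimal sketch of the recursive least-squares core on which the approach builds may help; note that the thesis's actual contributions (the subset-of-regressors kernel approximation and greedy online basis growth) are NOT included here, and all names are illustrative:

```python
class OnlineRLS:
    """Plain recursive least-squares: keeps the weight vector w and the
    inverse covariance matrix P, updating both in O(d^2) per sample via
    the Sherman-Morrison identity (no O(n^3) batch solve)."""
    def __init__(self, dim, reg=1e-6):
        self.w = [0.0] * dim
        # P starts as (reg * I)^(-1)
        self.P = [[(1.0 / reg if i == j else 0.0) for j in range(dim)]
                  for i in range(dim)]

    def update(self, x, y):
        Px = [sum(row[j] * x[j] for j in range(len(x))) for row in self.P]
        denom = 1.0 + sum(xj * pxj for xj, pxj in zip(x, Px))
        gain = [pxj / denom for pxj in Px]            # Kalman-style gain
        err = y - self.predict(x)                     # innovation
        self.w = [wj + gj * err for wj, gj in zip(self.w, gain)]
        self.P = [[self.P[i][j] - gain[i] * Px[j] for j in range(len(x))]
                  for i in range(len(x))]

    def predict(self, x):
        return sum(wj * xj for wj, xj in zip(self.w, x))

# fit y = 2x from a stream of samples, one constant-cost update each
model = OnlineRLS(dim=1)
for x, y in [([1.0], 2.0), ([2.0], 4.0), ([3.0], 6.0)]:
    model.update(x, y)
```

The per-sample cost is quadratic in the number of basis functions rather than cubic in the number of samples, which is what makes the online setting tractable; the subset-of-regressors step then keeps the basis itself small.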
Resumo:
Mainstream hardware is becoming parallel, heterogeneous, and distributed, on every desk, in every home, and in every pocket. As a consequence, in recent years software has been taking an epochal turn toward concurrency, distribution, and interaction, pushed by the evolution of hardware architectures and the growth of network availability. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, in the dissertation we first construct the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then focus on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. We then shift the perspective from the development of intelligent software systems toward general-purpose software development.
Using the expertise gained during the background phase, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development and at the same time provides an agent-oriented level of abstraction for the engineering of general-purpose software systems.
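For readers unfamiliar with the paradigm, the core of an actor can be sketched in a few lines (a generic illustration of the actor model, not simpAL code, whose syntax is not shown here):

```python
import queue
import threading

class Actor:
    """Minimal actor sketch: each actor owns a private mailbox and a
    thread, and processes one message at a time, so its internal state
    never needs locks."""
    def __init__(self):
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, msg):
        self.mailbox.put(msg)        # asynchronous, non-blocking send

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:          # poison pill terminates the actor
                break
            self.receive(msg)

    def receive(self, msg):
        raise NotImplementedError    # subclasses define behaviour

class Counter(Actor):
    def __init__(self):
        self.count = 0
        self.done = threading.Event()
        super().__init__()           # start the thread last

    def receive(self, msg):
        if msg == "inc":
            self.count += 1
        elif msg == "stop":
            self.done.set()

c = Counter()
for _ in range(3):
    c.send("inc")
c.send("stop")
c.done.wait(timeout=5.0)
```

Because the mailbox is FIFO and there is a single consumer thread, all "inc" messages are processed before "stop"; this sequential processing of messages against encapsulated state is precisely the object-plus-concurrency unification the abstract refers to.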
Resumo:
In the present work, target structures of autologous, tumor-reactive CD8+ T cells were characterized in the model of melanoma patient D41, who, at the metastatic stage, achieved a durable complete remission after vaccination with autologous dendritic cells and irradiated tumor cells (O'Rourke et al., Melanoma Res. 17:316, 2007). From cryopreserved blood lymphocytes of different time points, tumor-reactive CD8+ T cells were enriched by stimulation with autologous tumor cells (D41-MEL) in independent mixed lymphocyte/tumor cell cultures (MLTCs). First, it was examined whether they were directed against known melanoma antigens in association with the patient's HLA class I alleles. Reactivities were found against the melanosomal differentiation antigen Melan-A with HLA-A*0201 and, in addition, against the cancer/testis antigens (CTAs) MAGE-A3 and MAGE-A6 with HLA-A*0101, as well as NY-ESO-1, MAGE-A4, and MAGE-A10 with HLA-A*0201. In a second step, a cDNA expression library of D41-MEL was screened with T cell clones from D41-MLTC 2 that recognized none of these antigens. This led to the cloning, with one of the T cell clones, of a cDNA encoding TSPY 1 (testis-specific protein Y-encoded 1). This clone recognized with high affinity the synthetic TSPY 1 peptides LLDDIMAEV (amino acid positions 66-73) and LLLDDIMAEV (amino acid positions 65-73) in association with HLA-A*0201. Serological immune responses against TSPY 1, which is to be classified as a CTA, are known; in the present work, a T cell response against TSPY 1 was demonstrated for the first time. TSPY 1 presumably contributes to the development of gonadoblastoma, but its expression has also been detected in, e.g., seminomas, hepatocellular carcinomas, and melanomas. The expression of TSPY 1 in the cell line D41-MEL was very heterogeneous: individual clones of the line expressed TSPY 1 at stably high levels, others at equally stable intermediate or undetectable levels.
Expression, and recognition by TSPY 1-reactive T cell clones, was increased by the demethylating agent 5-aza-2'-deoxycytidine. This points to promoter hypermethylation as the cause of absent or low expression, as is the case for various CTAs. The antitumoral T cell reactivity detectable in the blood of patient D41 was already present before the vaccination with tumor cells and had thus developed spontaneously. Its individuality was determined by the antigen expression pattern of the D41 tumor cells, by the HLA phenotype, and presumably also by the T cell repertoire of the patient. The detailed analysis of complex antitumoral T cell responses lays the foundation for an immunotherapy that can draw on the actual potential of the individual T cell system.
Resumo:
This thesis deals with heterogeneous architectures in standard workstations. Heterogeneous architectures represent an appealing alternative to traditional supercomputers because they are based on commodity components fabricated in large quantities; hence their price-performance ratio is unparalleled in the world of high-performance computing (HPC). In particular, different aspects related to the performance and power consumption of heterogeneous architectures have been explored. The thesis initially focuses on an efficient implementation of a parallel application whose execution time is dominated by a high number of floating-point instructions. It then addresses the central problem of efficient management of power peaks in heterogeneous computing systems. Finally it discusses a memory-bound problem, where the execution time is dominated by memory latency. Specifically, the following main contributions have been carried out. First, a novel framework for the design and analysis of solar fields for Central Receiver Systems (CRS) has been developed. The implementation, based on a desktop workstation equipped with multiple Graphics Processing Units (GPUs), is motivated by the need for an accurate and fast simulation environment for studying mirror imperfections and non-planar geometries. Secondly, a power-aware scheduling algorithm for heterogeneous CPU-GPU architectures, based on an efficient distribution of the computing workload to the resources, has been realized. The scheduler manages the resources of several computing nodes with a view to reducing the peak power. This part makes two main contributions: the approach reduces the supply cost due to high peak power while having negligible impact on the parallelism of computational nodes, and the developed model allows designers to increase the number of cores without increasing the capacity of the power supply unit.
Finally, an implementation for efficient graph exploration on reconfigurable architectures is presented. The purpose is to accelerate graph exploration, reducing the number of random memory accesses.
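The graph-exploration kernel in question is typically a level-synchronous breadth-first search; a plain software baseline (an illustration of the access pattern, not the thesis's reconfigurable implementation) looks like:

```python
from collections import deque

def bfs_levels(adj, source):
    """Level-synchronous BFS: expands one frontier per iteration.
    The neighbour lookups adj[u] are irregular, random accesses,
    which is why memory latency dominates on large graphs."""
    dist = {source: 0}
    frontier = deque([source])
    while frontier:
        next_frontier = deque()
        for u in frontier:
            for v in adj[u]:                 # latency-bound random reads
                if v not in dist:
                    dist[v] = dist[u] + 1
                    next_frontier.append(v)
        frontier = next_frontier
    return dist

# small example: a diamond-shaped graph
adj = {0: [1, 2], 1: [3], 2: [3], 3: []}
dist = bfs_levels(adj, 0)
```

Since almost no arithmetic is performed per edge, speedups come from hiding or restructuring these memory accesses rather than from raw compute, which is the motivation for a reconfigurable-hardware design.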
Resumo:
This thesis reports on the creation and analysis of many-body states of interacting fermionic atoms in optical lattices. The realized system can be described by the Fermi-Hubbard Hamiltonian, which is an important model for correlated electrons in modern condensed matter physics. In this way, ultra-cold atoms can be utilized as a quantum simulator to study solid state phenomena. The use of a Feshbach resonance in combination with a blue-detuned optical lattice and a red-detuned dipole trap enables independent control over all relevant parameters in the many-body Hamiltonian. By measuring the in-situ density distribution and doublon fraction it has been possible to identify both metallic and insulating phases in the repulsive Hubbard model, including the experimental observation of the fermionic Mott insulator. In the attractive case, the appearance of strong correlations has been detected via an anomalous expansion of the cloud that is caused by the formation of non-condensed pairs. By monitoring the in-situ density distribution of initially localized atoms during the free expansion in a homogeneous optical lattice, a strong influence of interactions on the out-of-equilibrium dynamics within the Hubbard model has been found. The reported experiments pave the way for future studies on magnetic order and fermionic superfluidity in a clean and well-controlled experimental system.
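The Fermi-Hubbard Hamiltonian mentioned above has the standard form

```latex
\hat{H} = -t \sum_{\langle i,j \rangle, \sigma}
            \left( \hat{c}^{\dagger}_{i\sigma} \hat{c}_{j\sigma}
                   + \mathrm{h.c.} \right)
          + U \sum_{i} \hat{n}_{i\uparrow} \hat{n}_{i\downarrow}
          + \sum_{i,\sigma} V_i \, \hat{n}_{i\sigma} ,
```

where the tunneling amplitude t is set by the lattice depth, the on-site interaction U is tuned via the Feshbach resonance, and the confinement term V_i is set by the dipole trap; this mapping is what gives the independent control over all parameters described in the abstract.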