920 results for Coupling and Integration of Hydrologic Models II
Abstract:
Herpes simplex virus 1 (HSV-1) infects oral epithelial cells, then spreads to nerve endings and establishes latency in sensory ganglia, from where it may or may not reactivate. Diseases caused by virus reactivation range from mild muco-cutaneous lesions to severe, even life-threatening, encephalitis or systemic infections affecting diverse organs. Herpes simplex virus represents the most comprehensive example of virus-receptor interaction in the Herpesviridae family, and the prototype virus encoding multipartite entry genes. In fact, it encodes 11-12 glycoproteins and a number of additional membrane proteins; five of these proteins play key roles in virus entry into susceptible cells. Thus, glycoprotein B (gB) and glycoprotein C (gC) interact with heparan sulfate proteoglycans to enable initial attachment to cell surfaces. In the next step of the entry cascade, gD binds a specific surface receptor such as nectin1 or HVEM. The interaction of glycoprotein D with its receptor alters the conformation of gD and enables the activation of gB, glycoprotein H, and glycoprotein L, a trio of glycoproteins that execute the fusion of the viral envelope with the plasma membrane. In this thesis, I describe two distinct projects.
I. The retargeting of viral tropism for the design of oncolytic herpesviruses:
• capable of infecting cells through the human epidermal growth factor receptor 2 (HER2), which is overexpressed in highly malignant mammary and ovarian tumours and correlates with a poor prognosis;
• detargeted from the natural receptors HVEM and nectin1.
To this end, we inserted a ligand to HER2 in gD. Because HER2 has no natural ligand, the selected ligand was a single-chain antibody (scFv) derived from MAb4D5 (a monoclonal antibody to HER2), herein designated scHER2. All recombinant viruses were targeted to the HER2 receptor, but only two viruses (R-LM113 and R-LM249) were completely detargeted from HVEM and nectin1. To engineer R-LM113, we removed a large portion of the N-terminus of gD (from aa 6 to aa 38) and inserted the scHER2 sequence plus a 9-aa serine-glycine flexible linker at position 39. To engineer R-LM249, we instead replaced the Ig-folded core of gD (from aa 61 to aa 218) with scHER2 flanked by Ser-Gly linkers. In summary, these results provide evidence that: i. gD can tolerate an insert almost as big as gD itself; ii. the Ig-like domain of gD can be removed; iii. the large portion of the N-terminus of gD (from aa 6 to aa 38) can be removed without loss of key function; iv. the R-LM113 and R-LM249 recombinants are ready to be assayed in animal models of mammary and ovarian tumours. These findings and the availability of a large number of scFvs greatly increase the collection of potential receptors to which HSV can be redirected.
II. The production and purification of a recombinant truncated form of the heterodimer gHgL. We cloned a stable insect cell line expressing a soluble form of gH in complex with gL under the control of a metalloprotein-inducible promoter and purified the heterodimer by means of the ONE-STrEP-tag system by IBA. With respect to biological function, the purified heterodimer is capable:
• of reacting with antibodies that recognize conformation-dependent epitopes and neutralize virion infectivity;
• of binding a variety of cells at the cell surface.
No doubt, the availability of biologically active purified gHgL heterodimer in sufficient quantities will speed up efforts to solve its crystal structure and make it feasible to identify more clearly whether gHgL has a cellular partner, and what role this interaction plays in virus entry.
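As a minimal illustration of the construct design described above for R-LM113 and R-LM249, the following Python sketch assembles the retargeted gD sequences by simple residue-index arithmetic. The sequences and the linker shown are hypothetical placeholders, not the actual gD or scHER2 sequences.

```python
# Minimal sketch of the two gD retargeting strategies described above.
# gd_wt, scher2 and the linker are hypothetical placeholders, not real sequences.

gd_wt = "M" + "X" * 393          # placeholder wild-type gD (1-based residue numbering)
scher2 = "S" * 245               # placeholder scFv against HER2 (scHER2)
sg_linker = "SGSGSGSGS"          # 9-aa serine-glycine flexible linker

def residues(seq, start, end):
    """Return residues start..end of seq using 1-based, inclusive numbering."""
    return seq[start - 1:end]

# R-LM113: delete aa 6-38 of gD and insert scHER2 plus the 9-aa linker at position 39.
r_lm113_gd = residues(gd_wt, 1, 5) + scher2 + sg_linker + residues(gd_wt, 39, len(gd_wt))

# R-LM249: replace the Ig-folded core of gD (aa 61-218) with scHER2 flanked by Ser-Gly linkers.
r_lm249_gd = (residues(gd_wt, 1, 60) + sg_linker + scher2 + sg_linker
              + residues(gd_wt, 219, len(gd_wt)))

print(len(r_lm113_gd), len(r_lm249_gd))
```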
Abstract:
Doctoral programme: Animal Health
Abstract:
Nowadays, in ubiquitous computing scenarios, users increasingly demand to exploit online content and services through any device at hand, regardless of their physical location, with content and service access personalized and tailored to their own requirements. The coordinated provisioning of content tailored to user context and preferences, and the support for mobile multimodal and multichannel interactions, are of paramount importance in providing users with truly effective ubiquitous support. However, so far the intrinsic heterogeneity of these scenarios and the lack of an integrated approach have led to several proposals that are either too vertical or practically unusable, resulting in poor and non-versatile support platforms for ubiquitous computing. This work investigates and promotes design principles that help cope with these ever-changing and inherently dynamic scenarios. Following the outlined principles, we have designed and implemented a middleware platform to support the provisioning of ubiquitous mobile services and contents. To prove the viability of our approach, we have realized and stress-tested on top of our platform a number of different, extremely complex and heterogeneous content and service provisioning scenarios. The encouraging results obtained are pushing our research further, towards a dynamic platform that not only supports novel ubiquitous application scenarios by tailoring extremely diverse services and contents to heterogeneous user needs, but is also able to reconfigure and adapt itself in order to provide truly optimized and tailored support for ubiquitous service provisioning.
Abstract:
The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors argue that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor. We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
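As an illustration of the kind of translation mentioned above, the ConDec response constraint (every execution of activity a must eventually be followed by an execution of activity b) can be rendered as a forward integrity constraint relating happened events to positive expectations, in the spirit of the SCIFF-style rules underlying CLIMB. The rendering below is schematic and may differ from the exact CLIMB syntax:

```latex
% Schematic SCIFF-style rendering of the ConDec constraint response(a, b):
% whenever an execution of activity a happens at time T, an execution of b
% is expected to happen at some later time T'.
\mathbf{H}(exec(a), T) \;\rightarrow\; \mathbf{E}(exec(b), T') \wedge T' > T
```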
Abstract:
In this interdisciplinary translation-studies work, the integration of curriculum and assessment in interpreter training is given a theoretical foundation and investigated empirically in a case study. Interpreting competence is regarded as an outcome of curriculum implementation, documented by valid and reliable assessment methods. Definitions, foundations, approaches, and training and learning objectives are described on the basis of curriculum theory and interpreting studies. Traditional and alternative assessment methods are tested with regard to their applicability in interpreter training. In the case study, the examination results of two Master's programmes, the MA in Conference Interpreting and the MA in Interpreting and Translation, are analysed quantitatively. The assessment methodology used to document the examination results is examined qualitatively and related to the quantitative analysis. The case study consists of 1) a chi-square analysis of final examination grades, broken down by language combination and examination category (n = 260), 2) a survey of jury members concerning assessment approaches, procedures, and criteria (n = 45; 62.22% response rate), and 3) an analysis of the source-language examination material, likewise by language combination and examination category. It is shown that students in the MA in Interpreting and Translation tend to perform worse in examinations than students in the MA in Conference Interpreting. However, the results of the analysis are considered to have limited explanatory power owing to a lack of assessment validity. Steps towards optimizing the curriculum and the assessment, as well as a more efficient curriculum model, are derived from the theoretical approaches. The role of ethics in assessment methodology is also addressed.
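As a minimal, hedged illustration of the kind of chi-square analysis described in the case study, the sketch below tests whether exam outcome is independent of the degree programme; the contingency table is entirely invented and only demonstrates the procedure, not the thesis data.

```python
# Hypothetical example of a chi-square test of independence between degree
# programme and exam outcome; the counts below are invented, not the study's data.
from scipy.stats import chi2_contingency

#                     pass  fail
observed = [[80, 30],   # MA Conference Interpreting
            [90, 60]]   # MA Interpreting and Translation

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```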
Abstract:
The hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a very relevant issue, due to the severe consequences that floods, and waters in general, may provoke in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damages can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with a residual uncertainty about what will actually happen. This type of uncertainty is what is discussed and analyzed in this thesis. In operational problems, it can be argued that the ultimate aim of forecasting systems is not to reproduce the river behavior: this is only a means of reducing the uncertainty about what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since the literature is often confused on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting with a key question: should the choice of the intervention strategy be based on the evaluation of the model prediction, i.e. on its ability to represent reality, or on the evaluation of what will actually happen on the basis of the information given by the model forecast? Once the previous idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making it possible to carry out objective and realistic risk evaluations. In particular, such a tool should be able to provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must be able to correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to that required to implement the intervention strategy, and it is also necessary to assess the probability of the flooding time.
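As a minimal sketch of the kind of probabilistic assessment discussed above, the code below estimates, from a hypothetical ensemble of water-level forecasts, the probability that a flooding threshold is exceeded within a given time horizon, together with the distribution of the first exceedance time. It is an illustrative assumption about how such quantities could be computed, not the decision-support tool developed in the thesis.

```python
# Illustrative sketch: flooding probability within a time horizon from an
# ensemble of water-level forecasts (members x lead times). All numbers invented.
import numpy as np

rng = np.random.default_rng(0)
n_members, n_steps = 50, 24            # hypothetical ensemble size and hourly lead times
levels = rng.normal(loc=2.0, scale=0.6, size=(n_members, n_steps)).cumsum(axis=1) * 0.05 + 2.0

threshold = 3.0                        # hypothetical flooding threshold [m]
horizon = 12                           # hours available to implement the intervention

# Probability that the threshold is exceeded at least once within the horizon.
exceed_within_horizon = (levels[:, :horizon] > threshold).any(axis=1)
p_flood = exceed_within_horizon.mean()

# Distribution of the time of first exceedance, over the members that do exceed.
exceeding = (levels > threshold).any(axis=1)
first_exceed = np.argmax(levels > threshold, axis=1)[exceeding]

print(f"P(flooding within {horizon} h) = {p_flood:.2f}")
print("first-exceedance times (h):", np.bincount(first_exceed, minlength=n_steps)[:horizon])
```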
Abstract:
The aim of this work is to analyse principles of contour integration in the human visual system. The perceptual binding of neighbouring parts of a visual scene into a whole is characterized by two propositions grounded in Gestalt theory, which describe complementary local mechanisms of contour integration. The first principle of contour integration states that local similarity of elements in a feature other than orientation is not sufficient for the detection of contours; rather, an additional statistical feature difference between contour elements and their surround must be present to enable contour detection. The second principle of contour integration states that collinear alignment of contour elements is sufficient for contour integration and, when present, yields robust contour-integration performance even if the local feature-carrying elements vary largely at random in other features and thus show no neighbourhood similarity relation along the contour. As an empirical basis for the two proposed principles of contour integration, three experiments are reported which first demonstrate the subordinate role of global contour features such as closure in contour detection, and then examine the importance of local mechanisms for contour integration with respect to the features collinearity and spatial frequency, as well as the specific way in which these two features interact. The first experiment examines the global feature of closure and shows that closed contours are not detected more effectively than open contours. The second experiment shows the robustness of contours defined by collinearity against random variation in spatial frequency along the contour and in the background, as well as the impossibility of contour integration based on neighbourhood similarity of the contour elements when similarity is realized via identical spatial frequencies instead of collinear orientation. The third experiment shows that a redundant combination of collinear orientation with a statistical difference in spatial frequency leads to considerable visibility gains in contour detection. On the basis of the strength of this summation effect, it is proposed that the combination of multiple cues engages additional cortical mechanisms that support contour detection. The results of the three experiments are placed in the context of current research on object perception, and their significance for the postulated general principles of visual grouping in contour integration is discussed. Using phenomenological examples with features other than orientation and spatial frequency, it is shown that the principles found can claim generalizability for the processing of contours in the visual system.
Abstract:
The relationship between emotion and cognition is a topic that raises great interest in research. Recently, a view of these two processes as interactive and mutually influencing each other has become predominant. This dissertation investigates the reciprocal influences of emotion and cognition, at both the behavioral and neural level, in two specific fields: attention and decision-making. Experimental evidence is reported on how emotional responses may affect perceptual and attentional processes. In addition, the impact of three factors, namely personality traits, motivational needs and social context, in modulating the influence that emotion exerts on perception and attention has been investigated. Moreover, the influence of cognition on emotional responses in decision-making has been demonstrated. The current experimental evidence showed that cognitive brain regions such as the dorsolateral prefrontal cortex are causally implicated in the regulation of emotional responses, and that this has an effect at both pre- and post-decisional stages. There are two main conclusions of this dissertation: first, emotion exerts a strong influence on perceptual and attentional processes but, at the same time, this influence may also be modulated by other factors internal and external to the individual. Second, cognitive processes may modulate prepotent emotional responses, serving a regulative function critical to driving and shaping human behavior in line with current goals.
Abstract:
In the last few years the resolution of numerical weather prediction (NWP) models has become higher and higher with the progress of technology and knowledge. As a consequence, a large amount of initial data has become fundamental for a correct initialization of the models. The potential of radar observations for improving the initial conditions of high-resolution NWP models has long been recognized, and operational applications are becoming more frequent. The fact that many NWP centres have recently put convection-permitting forecast models into operation, many of which assimilate radar data, emphasizes the need for an approach to providing quality information, in order to prevent radar errors from degrading the model's initial conditions and, therefore, its forecasts. Environmental risks can be related to various causes: meteorological, seismic, hydrological/hydraulic. Flash floods have horizontal dimensions of 1-20 km and belong to the meso-gamma scale; this scale can be modelled only with the highest-resolution NWP models, such as the COSMO-2 model. One of the problems in modelling extreme convective events is related to the atmospheric initial conditions: the scale at which atmospheric conditions are assimilated in a high-resolution model is about 10 km, a value too coarse for a correct representation of the initial conditions of convection. Assimilation of radar data, with their resolution of about a kilometre every 5 or 10 minutes, can be a solution to this problem. In this contribution a pragmatic and empirical approach to deriving a radar data quality description is proposed, to be used in radar data assimilation and more specifically in the latent heat nudging (LHN) scheme. The convective capabilities of the COSMO-2 model are then investigated through some case studies. Finally, this work presents some preliminary experiments on the coupling of a high-resolution meteorological model with a hydrological one.
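As a purely illustrative sketch of how a radar quality descriptor could enter an LHN-type scheme, the code below blends radar-derived and model precipitation according to a per-pixel quality weight before computing a latent-heating scaling factor; the blending rule and the clipping limits are assumptions for illustration, not the scheme actually implemented in the thesis or in COSMO-2.

```python
# Illustrative sketch: quality-weighted precipitation analysis and the resulting
# scaling factor for a latent heat nudging (LHN) type scheme.
# The blending rule and clipping limits are assumptions, not COSMO's implementation.
import numpy as np

rr_radar = np.array([0.0, 2.5, 8.0, 1.0])    # radar-derived rain rates [mm/h] (invented)
rr_model = np.array([0.5, 1.0, 4.0, 1.0])    # model rain rates [mm/h] (invented)
quality  = np.array([0.2, 0.9, 0.7, 1.0])    # per-pixel radar quality in [0, 1] (invented)

# Where radar quality is low, fall back towards the model's own precipitation.
rr_analysis = quality * rr_radar + (1.0 - quality) * rr_model

# LHN-style scaling factor: ratio of analysed to modelled precipitation,
# clipped to avoid extreme heating/cooling increments.
eps = 0.1
scale = np.clip((rr_analysis + eps) / (rr_model + eps), 0.5, 2.0)

# The model's latent-heating profile would then be rescaled by this factor
# to obtain the temperature increment added at each grid point.
print(scale)
```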
Abstract:
MFA and LCA methodologies were applied to analyse the anthropogenic aluminium cycle in Italy, with a focus on the historical evolution of stocks and flows of the metal, embodied GHG emissions, and recycling potentials, in order to provide Italy with key elements for prioritizing industrial policy towards low-carbon technologies and materials. Historical time series were collected from 1947 to 2009 and balanced with data from the production, manufacturing and waste management of aluminium-containing products, using a 'top-down' approach to quantify the contemporary in-use stock of the metal and helping to identify 'applications where aluminium is not yet being recycled to its full potential and to identify present and future recycling flows'. The MFA results were used as the basis for an LCA aimed at evaluating the evolution of the carbon footprint embodied in Italian aluminium, from primary and electrical energy, the smelting process and transportation. A discussion of how the main factors of the Kaya Identity influenced the Italian GHG emissions pattern over time, and of the levers available to mitigate it, is also reported. The contemporary anthropogenic reservoir of aluminium was estimated at about 320 kg per capita, mainly embedded within the transportation and the building and construction sectors. The cumulative in-use stock represents approximately 11 years of supply at current usage rates (about 20 Mt versus 1.7 Mt/year), and it would imply a potential of about 160 Mt of CO2eq emission savings. A discussion of the criticalities related to aluminium waste recovery from the transportation and the containers and packaging sectors was also included in the study, providing an example of how MFA and LCA may support decision-making at the sectoral or regional level. The research constitutes the first attempt at an integrated approach between MFA and LCA applied to the aluminium cycle in Italy.
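For reference, the Kaya Identity mentioned above decomposes total emissions into demographic, economic and technological driving factors; in its standard form it reads:

```latex
% Kaya Identity: decomposition of CO2 (or GHG) emissions into driving factors.
CO_2 = P \times \frac{GDP}{P} \times \frac{E}{GDP} \times \frac{CO_2}{E}
```

Here P is population, GDP/P is per-capita economic output, E/GDP is the energy intensity of the economy, and CO2/E is the carbon intensity of the energy supply.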
Abstract:
One important metaphor, drawn from biological theories and used to investigate organizational and business strategy issues, is the metaphor of heredity; an area requiring further investigation is the extent to which the characteristics of blueprints inherited from the parent help explain the subsequent development of spawned ventures. In order to shed light on the tension between inherited patterns and the new trajectory that may characterize spawned ventures' development, we propose a model aimed at investigating which blueprint elements might exert an effect on business model design choices, and to what extent their persistence (or abandonment) determines subsequent business model innovation. Under the assumption that academic and corporate institutions transmit different genes to their spin-offs, we therefore expect heterogeneity in the elements that affect business model design choices and their subsequent evolution. This is why we carry out a twofold analysis in the biotech (meta)industry: under a multiple-case research design, the business model, and especially the fundamental design elements and themes that scholars have identified to decompose the construct, have been thoroughly analysed. Our purpose is to isolate the dimensions of the business model that may have been the object of legacy and those along which an experimentation and learning process is more likely to happen, bearing in mind that the differences between academic and corporate spin-offs might not be as evident as expected, especially considering that business model innovation may occur.
Abstract:
In recent years, the use of Reverse Engineering systems has attracted considerable interest for a wide range of applications. Consequently, many research activities are focused on the accuracy and precision of the acquired data and on improvements to the post-processing phase. In this context, this PhD Thesis deals with the definition of two novel methods for data post-processing and for data fusion between physical and geometrical information. In particular, a technique has been defined for characterizing the error in the 3D point coordinates acquired by an optical triangulation laser scanner, with the aim of identifying adequate correction arrays to apply under different acquisition parameters and operative conditions. The systematic error in the acquired data is thus compensated, in order to increase accuracy. Moreover, the definition of a 3D thermogram is examined. The geometrical information of an object and its thermal properties, coming from a thermographic inspection, are combined in order to obtain a temperature value for each recognizable point. Data acquired by the optical triangulation laser scanner are also used to normalize temperature values and make the thermal data independent of the thermal camera's point of view.
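As a minimal sketch of the kind of data fusion described above, the code below assigns a temperature to each 3D point by projecting it into a thermal image with a pinhole camera model; the intrinsic parameters, the point cloud and the image are invented placeholders, not the calibration or the data used in the thesis.

```python
# Illustrative sketch: building a 3D thermogram by projecting scanned 3D points
# into a thermal image (pinhole model). All numbers are invented placeholders.
import numpy as np

points = np.array([[ 0.1,  0.0, 1.0],
                   [ 0.0,  0.2, 1.5],
                   [-0.1, -0.1, 2.0]])           # 3D points in the thermal-camera frame [m]

K = np.array([[400.0,   0.0, 160.0],             # hypothetical thermal-camera intrinsics
              [  0.0, 400.0, 120.0],
              [  0.0,   0.0,   1.0]])

thermal_image = 20.0 + np.random.default_rng(1).random((240, 320)) * 15.0   # fake map [degC]

# Project each point: u = fx*X/Z + cx, v = fy*Y/Z + cy.
uv = (K @ points.T).T
uv = uv[:, :2] / uv[:, 2:3]
cols = np.clip(np.round(uv[:, 0]).astype(int), 0, thermal_image.shape[1] - 1)
rows = np.clip(np.round(uv[:, 1]).astype(int), 0, thermal_image.shape[0] - 1)

temperatures = thermal_image[rows, cols]         # one temperature per 3D point
thermogram_3d = np.hstack([points, temperatures[:, None]])
print(thermogram_3d)
```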