919 results for Conditional Logic
Abstract:
Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented towards achieving some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to manage this complexity and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised several issues, from the language for specifying them to the various verification aspects. Computational Logic provides models, languages and tools that can be effectively adopted to address such issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that makes it possible to reason upon the specifications and to test the conformance of given interactions w.r.t. a defined protocol. Moreover, by suitably adapting the SCIFF framework, we propose solutions for (1) the verification of protocol properties (the g-SCIFF framework) and (2) the a-priori verification of peer conformance w.r.t. a given protocol (the AlLoWS framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, in which the same protocol specification can be used to program the interacting peers and to ease their implementation.
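The expectation-based flavor of conformance checking described above can be illustrated with a toy sketch. This is a drastic simplification, not the actual SCIFF proof procedure (which reasons abductively over partial histories): here, protocol rules simply raise positive expectations when events happen, and a history is conformant if every raised expectation is matched by a later event. All rule and event names are made up for illustration.

```python
# Toy illustration of expectation-based conformance checking (a hypothetical
# simplification of the SCIFF idea, not its real proof procedure).

def raised_expectations(history, rules):
    """Collect (position, expected_event) pairs raised by happened events."""
    expectations = []
    for i, event in enumerate(history):
        for trigger, expected in rules:
            if event == trigger:
                expectations.append((i, expected))
    return expectations

def conformant(history, rules):
    """A history is conformant if each expectation is fulfilled later on."""
    for i, expected in raised_expectations(history, rules):
        if expected not in history[i + 1:]:
            return False
    return True

# A hypothetical two-party protocol: every 'request' expects a later 'answer'.
rules = [("request", "answer")]
print(conformant(["request", "answer"], rules))  # True: the answer arrives
print(conformant(["request"], rules))            # False: expectation violated
```

The real framework additionally handles negative expectations, variables with constraints, and open (still-running) histories, none of which this sketch attempts.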
Abstract:
Sustainable computer systems require some flexibility to adapt to unpredictable environmental changes. A solution lies in autonomous software agents, which can adapt autonomously to their environments. Although autonomy allows agents to decide which behavior to adopt, a disadvantage is a lack of control and, as a side effect, even untrustworthiness: we want to keep some control over such autonomous agents. How can we control autonomous agents while respecting their autonomy? A solution is to regulate agents' behavior by norms. The normative paradigm makes it possible to control autonomous agents while respecting their autonomy, limiting untrustworthiness and augmenting system compliance. It can also facilitate the design of the system, for example by regulating the coordination among agents. However, an autonomous agent may follow norms or violate them under certain conditions. Under which conditions is a norm binding upon an agent? While autonomy is regarded as the driving force behind the normative paradigm, cognitive agents provide a basis for modeling the bindingness of norms. In order to cope with the complexity of modeling cognitive agents and normative bindingness, we adopt an intentional stance. Since agents are embedded in a dynamic environment, things may not happen at the same instant. Accordingly, our cognitive model is extended to account for some temporal aspects. Special attention is given to the temporal peculiarities of the legal domain such as, among others, the time in force and the time in efficacy of provisions. Some types of normative modifications are also discussed within the framework. It is noteworthy that our temporal account of legal reasoning is integrated with our commonsense temporal account of cognition. As our intention is to build sustainable reasoning systems running in unpredictable environments, we adopt a declarative representation of knowledge.
A declarative representation of norms makes it easier to update their representation in the system, thus facilitating system maintenance, and improves system transparency, thus easing system governance. Since agents are bounded reasoners embedded in unpredictable environments, and since conflicts may appear among mental states and norms, agent reasoning has to be defeasible, i.e., new pieces of information can invalidate formerly derivable conclusions. In this dissertation, our model is formalized in a non-monotonic logic, namely a temporal modal defeasible logic, in order to account for the interactions between normative systems and software cognitive agents.
Abstract:
Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition makes it possible to sample multiple channels of the PMT simultaneously at different gain factors, in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and specific firmware has been realized to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature would probably be integrated into a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the configuration memory of the FPGA implied the integration of a flash ISP (In-System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory, the behavior of the LIRA chip was investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA.
The PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to a loan from the Roma University group and INFN, a full readout chain equivalent to that present in NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which were able to receive and execute commands issued from the PC console and to answer back with a reply. The remotely configurable logic also behaved well and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front end while inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. This chip is a matrix of 4096 active pixel sensors with deep N-well implantations meant to collect charge and to shield the analog electronics from digital noise. The chip integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in 130 nm STM CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line facility at CERN, Geneva (CH).
The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which made it possible to store about 90 million events in 7 equivalent days of live time of the beam. My activities basically concerned the realization of a firmware interface towards and from the MAPS chip, in order to integrate it into the general DAQ system. Thereafter, I worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinnings were tested during the test beam. Those thinned to 100 and 300 um presented an overall efficiency of about 90% with a threshold of 450 electrons. The test beam also allowed the resolution of the pixel sensor to be estimated, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution was extracted from the width of the residual plot, taking the multiple scattering effect into account.
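The pitch/sqrt(12) formula cited above is the standard expected intrinsic resolution of a binary-readout segmented detector: for hits uniformly distributed across a pixel of pitch p, the RMS of the residual is p divided by the square root of 12 (the standard deviation of a uniform distribution over an interval of width p). A quick numerical check, using an illustrative pitch value rather than the actual APSEL-4D one:

```python
import math

# Expected intrinsic resolution of a binary-readout pixel sensor:
# sigma = pitch / sqrt(12), the RMS of a uniform distribution of width `pitch`.
def binary_resolution(pitch_um):
    return pitch_um / math.sqrt(12)

# Illustrative pitch only; not the APSEL-4D pitch.
print(f"{binary_resolution(50.0):.1f} um")  # ≈ 14.4 um for a 50 um pitch
```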
Abstract:
The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor.
We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
Abstract:
Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) w.r.t. a goal and to provide results such as "the goal is derivable from the KB (of the theory)". In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic, we usually try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with logic programming we need an efficient algorithm to find such implications. In this work we use a logic rather similar to human reasoning. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion from premises that are not definitely true and belong to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). These kinds of applications are useful in the legal area, especially if they offer an implementation of an argumentation framework that provides a formal model of the game. Roughly speaking, if the theory is the set of laws, a key claim is the conclusion that one of the parties wants to prove (and the other wants to defeat), and we add dynamic assertion of rules, namely facts put forward by the parties, then we can play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the different strategies adopted by the players.
Implementing a game model requires a further meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a meta-level containing different meta-evaluators. The first has been explained above; the second is needed to perform the game model; and the last is used to change game execution and tree-derivation strategies.
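The flavor of defeasible derivation discussed above can be sketched in a few lines. This is a hypothetical, heavily simplified propositional engine, not Nute's full logic or the thesis's meta-interpreter: a literal is defeasibly provable if it is a fact, or if some applicable rule supports it and beats every applicable opposing rule via the superiority relation.

```python
# Minimal sketch in the spirit of defeasible logic (hypothetical simplification:
# propositional literals, one-step rules, negation written as a '~' prefix).

def defeasibly_provable(goal, facts, rules, superior):
    """Rules are (name, head, body) triples; `superior` is a set of
    (winner, loser) rule-name pairs resolving conflicts."""
    if goal in facts:
        return True
    neg = goal[1:] if goal.startswith("~") else "~" + goal
    # Applicable rules for the goal and for its negation.
    pro = [name for name, head, body in rules
           if head == goal and all(b in facts for b in body)]
    con = [name for name, head, body in rules
           if head == neg and all(b in facts for b in body)]
    # Some supporting rule must beat every applicable opposing rule.
    return any(all((p, c) in superior for c in con) for p in pro)

# Classic example: birds fly, penguins don't; the penguin rule is superior.
rules = [("r1", "flies", ["bird"]), ("r2", "~flies", ["penguin"])]
print(defeasibly_provable("flies", {"bird"}, rules, set()))                      # True
print(defeasibly_provable("flies", {"bird", "penguin"}, rules, {("r2", "r1")}))  # False
```

The argumentation-game layer described in the abstract would then alternate players asserting rules and facts, re-running such a derivation after each move.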
Abstract:
Recently, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the higher demand for systems with great performance in terms of the quality of the products/services generated, productivity, efficiency, and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, buy boxed products such as food or cigarettes, and so on. Another indication of this complexity is the fact that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in defining the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through the verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing diagnostic information in real time, as a support for the maintenance operations of the machine. The kind of facilities that designers can directly find on the market, in terms of software component libraries, in fact provides adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to enlighten the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been embracing this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems has been observed. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are intrinsically more vulnerable. The problem of diagnosis and fault isolation in a generic dynamical system consists in designing a processing unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2, a survey of the state of the art of software engineering paradigms applied to industrial automation is presented. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5, a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems that should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
Abstract:
The TGFbeta/BMP signal transduction cascade is important for many developmental processes in almost all embryonic and extraembryonic tissues, and it is likewise essential for maintaining homeostasis in the adult organism. Many mouse models and cell culture experiments have shown that ligands of this pathway are involved in various stages of cartilage and bone development. BMPs, for instance, play a decisive role in the early condensation and formation of cartilage, and later in the proliferation and hypertrophy of chondrocytes. BMPs can induce ectopic bone formation, and the expression pattern of the ligands and specific receptors in the growth plate suggests an important role of BMPs in the growth plate. The targeted knock-out of the BMP receptors Bmpr1a and Bmpr1b in proliferating chondrocytes leads to a generalized chondrodysplasia. Smad1, Smad5 and Smad8 are the mediators of the BMP signaling cascade. The aim of the present work was to investigate the role and function of the Smad1 and Smad5 proteins in the growth plate. To this end, conditional Smad1 knock-out mice were crossed with a transgenic mouse line that expresses Cre recombinase specifically in proliferating chondrocytes. These mice were characterized with and without a heterozygous Smad5 background. A knock-out of Smad1 alone produced a slight shortening of the growth plate, with the prehypertrophic and hypertrophic zones equally affected. This phenotype was enhanced in mice with an additional heterozygous Smad5 background. A reduced proliferation rate was demonstrated together with diminished Ihh expression. In addition, X-ray images revealed a disorganization of the nasal region and a missing nasal septum. Production and mineralization of the extracellular matrix were not impaired.
To compare the roles of the BMP and TGFbeta signaling cascades during endochondral ossification, transgenic mice were generated in which TGFbeta signaling was disrupted specifically in proliferating chondrocytes. Two mouse lines showing similar phenotypes were investigated. Esl1 is a TGFbeta-binding protein that is thought to be able to inhibit TGFbeta signaling. Esl1 knock-out mice are smaller than wild-type mice, and the overexpression of Esl1 in proliferating chondrocytes leads to an elongation of the growth plate and an increased proliferation rate. Cartilage markers such as Col2a1 and Sox9 are downregulated in these mice, while Col10a1 and Ihh, as markers of the hypertrophic and prehypertrophic zones, were also downregulated. This suggests that more cells enter terminal differentiation. In transgenic mice overexpressing a dominant-negative (dn) TGFbeta receptor in proliferating chondrocytes, an elongated prehypertrophic zone, increased Ihh expression and an increased proliferation rate were observed. Additionally, homozygous animals showed a craniofacial phenotype that led to feeding problems and thereby to a severe growth impairment. The BMP and TGFbeta signaling cascades may have antagonistic effects in the growth plate. While the loss of BMP signaling in proliferating chondrocytes led to a shortening of the growth plate due to a decreased proliferation rate, mice with disrupted TGFbeta signaling show increased proliferation in a correspondingly elongated growth plate. A further goal of this work was the generation of a transgenic mouse line expressing Cre recombinase specifically in hypertrophic chondrocytes.
Promoter studies with transgenic mice indicate that a putative AP1 element, located about 4 kb upstream of the first exon of Col10a1, is important for the specific expression in hypertrophic chondrocytes. A construct containing four copies of this element and the basal promoter was used to express Cre recombinase specifically. This mouse line is in the testing phase, and first data point to a specific expression of Cre recombinase in hypertrophic chondrocytes.
Abstract:
Tectonic and aesthetic implications of integrated monocoque logics and stress-line analysis in architecture.
Abstract:
The thesis takes as its object of inquiry the paths of political and civic participation of young people in contexts of transition to adult life, focusing on the influence of relationships between generations on these expressions of involvement. The empirical study consists of qualitative research conducted in the Navile district of Bologna in 2012. Based on the methodological approach of grounded theory, it involved a sample of young people and a sample of adults significant to them, through semi-structured interviews. The analysis reveals a marked youth disaffection with politics which, however, does not translate into a refusal of involvement, but into a "participation with reservations", expressed through anything-but-passive attitudes towards formal politics (based on logics of reform, resistance or rebellion) and through a strong investment in unconventional participatory activities (associational life and engagement). Behind young people's participatory interest lies a negative reading of their present condition and a consequent, rather overt, intergenerational conflict, which is reflected in their very modes of activation. Politics, in its strictly formal expressions, is interpreted as an "adult territory", managed according to logics that leave little room for young people, who for this reason choose to become active in alternative ways, in which confrontation with the other, when present, occurs mainly among peers or on bases perceived as more equal. Young people's distancing from formal politics thus reflects a parallel distancing from adults, who appear lost in carrying out their functions of role model and recognition.
Their ambivalence towards young people, a continuous oscillation between deep pessimism and blind optimism, between directive guidance and abdication of responsibility, translates into only a partial recognition of the real potential and needs of young people as citizens and emerging adults.
Abstract:
Delineating brain tumor boundaries from magnetic resonance images is an essential task in the analysis of brain cancer. We propose a fully automatic method for brain tissue segmentation, which combines Support Vector Machine classification using multispectral intensities and textures with a subsequent hierarchical regularization based on Conditional Random Fields. The CRF regularization introduces spatial constraints into the powerful SVM classification, which assumes voxels to be independent of their neighbors. The approach first separates healthy and tumor tissue, before both regions are subclassified, in a novel hierarchical way, into cerebrospinal fluid, white matter and gray matter, and into necrotic, active and edema regions, respectively. The hierarchical approach adds robustness and speed by allowing different levels of regularization to be applied at different stages. The method is fast and tailored to standard clinical acquisition protocols. It was assessed on 10 multispectral patient datasets, with results outperforming previous methods in terms of segmentation detail and computation time.
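The role of the spatial regularization step can be illustrated with a minimal sketch. Real CRF inference minimizes an energy over a neighborhood graph; as a crude stand-in for that machinery, the toy filter below applies neighborhood majority voting to per-voxel labels (all labels and data here are synthetic), showing how spatial constraints remove isolated misclassifications left by an independent per-voxel classifier such as an SVM.

```python
# Toy stand-in for CRF-style spatial regularization (not the method's actual
# inference): one pass of 1-D neighborhood majority voting over voxel labels.

def majority_smooth(labels):
    """Relabel each interior voxel by the majority of its 3-voxel window."""
    smoothed = list(labels)
    for i in range(1, len(labels) - 1):
        window = labels[i - 1:i + 2]
        smoothed[i] = max(set(window), key=window.count)
    return smoothed

# A run of 'tumor' voxels with one isolated 'healthy' misclassification.
raw = ["tumor", "tumor", "healthy", "tumor", "tumor"]
print(majority_smooth(raw))  # the isolated 'healthy' voxel is relabeled 'tumor'
```

Unlike this filter, a CRF weighs the smoothness penalty against the classifier's per-voxel confidence, which is what preserves genuine boundaries while removing noise.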
Abstract:
Portal hypertension (PH) is a common complication and a leading cause of death in patients with chronic liver diseases. PH is underlain by structural and functional derangement of the liver sinusoid vessels and their fenestrated endothelium. Because in most clinical settings PH is accompanied by parenchymal injury, it has been difficult to determine the precise role of microvascular perturbations in causing PH. Reasoning that Vascular Endothelial Growth Factor (VEGF) is required to maintain the functional integrity of the hepatic microcirculation, we developed a transgenic mouse system for liver-specific, reversible VEGF inhibition. The system is based on conditional induction and de-induction of a VEGF decoy receptor that sequesters VEGF and precludes signaling. VEGF blockade results in the closure of sinusoidal endothelial cell (SEC) fenestrations and in the accumulation and transformation of the normally quiescent hepatic stellate cells, i.e., it provokes the two processes underlying sinusoidal capillarization. Importantly, sinusoidal capillarization was sufficient to cause PH and its typical sequelae (ascites, splenomegaly and venous collateralization) without inflicting parenchymal damage or fibrosis. Remarkably, these dramatic phenotypes were fully reversed within a few days of lifting the VEGF blockade and the resultant re-opening of SEC fenestrations. This study not only uncovered an indispensable role for VEGF in maintaining the structure and function of mature SECs, but also highlights the vasculo-centric nature of PH pathogenesis. The unprecedented ability to rescue PH and its secondary manifestations by manipulating a single vascular factor may also be harnessed to examine the potential utility of de-capillarization treatment modalities.