983 results for Minimal Systems
Abstract:
Dynamic and distributed environments are hard to model since they suffer from unexpected changes, incomplete knowledge, and conflicting perspectives and, thus, call for appropriate knowledge representation and reasoning (KRR) systems. Such KRR systems must handle sets of dynamic beliefs, be sensitive to communicated and perceived changes in the environment and, consequently, may have to drop current beliefs in the face of new findings or disregard any new data that conflicts with stronger convictions held by the system. Not only do they need to represent and reason with beliefs, but they must also perform belief revision to maintain the overall consistency of the knowledge base. One way of developing such systems is to use reason maintenance systems (RMS). In this paper we provide an overview of the most representative types of RMS, also known as truth maintenance systems (TMS), which are computational instances of the foundations-based theory of belief revision. An RMS module works together with a problem solver. The latter feeds the RMS with assumptions (core beliefs) and conclusions (derived beliefs), which are accompanied by their respective foundations. The role of the RMS module is to store the beliefs, associate with each belief (core or derived) the corresponding set of supporting foundations, and maintain the consistency of the overall reasoning by keeping, for each represented belief, the current supporting justifications. Two major approaches to reason maintenance are used: single- and multiple-context reasoning systems. In single-context systems (the justification-based TMS, JTMS, or the logic-based TMS, LTMS), each belief is associated with the beliefs that directly generated it, whereas in multiple-context systems (the assumption-based TMS, ATMS, or the multiple belief reasoner, MBR), each belief is associated with the minimal set of assumptions from which it can be inferred.
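The justification-based labelling described above can be sketched in a few lines of Python. This is an illustrative toy, not any particular RMS implementation: a belief is labelled IN when at least one of its justifications has all antecedents IN, assumptions are IN unconditionally, and retracting an assumption triggers relabelling so that beliefs which lose all support go OUT.

```python
# Toy justification-based TMS (JTMS) sketch; class and method names are
# hypothetical, invented for illustration only.

class JTMS:
    def __init__(self):
        self.assumptions = set()
        self.justifications = {}  # belief -> list of antecedent sets

    def add_assumption(self, belief):
        """Core beliefs supplied by the problem solver."""
        self.assumptions.add(belief)

    def justify(self, belief, antecedents):
        """Record a justification: belief holds if all antecedents hold."""
        self.justifications.setdefault(belief, []).append(set(antecedents))

    def label(self):
        """Compute the IN set by propagating support to a fixed point."""
        status = set(self.assumptions)
        changed = True
        while changed:
            changed = False
            for belief, justs in self.justifications.items():
                if belief not in status and any(j <= status for j in justs):
                    status.add(belief)
                    changed = True
        return status

    def retract(self, assumption):
        """Foundations-based revision: drop an assumption and relabel."""
        self.assumptions.discard(assumption)
        return self.label()

tms = JTMS()
tms.add_assumption("a")
tms.add_assumption("b")
tms.justify("c", ["a", "b"])   # c is derived from a and b
tms.justify("d", ["c"])        # d is derived from c
assert tms.label() == {"a", "b", "c", "d"}
assert tms.retract("b") == {"a"}   # c and d lose support and go OUT
```

Retracting `"b"` illustrates the foundations-based behaviour: every belief whose supporting justifications depended on the retracted assumption disappears automatically.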
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, in fulfilment of the requirements for the degree of Master in Electrical and Computer Engineering.
Abstract:
We exhibit the construction of stable arc exchange systems from the stable laminations of hyperbolic diffeomorphisms. We prove a one-to-one correspondence between (i) Lipschitz conjugacy classes of C(1+H) stable arc exchange systems that are C(1+H) fixed points of renormalization and (ii) Lipschitz conjugacy classes of C(1+H) diffeomorphisms f with hyperbolic basic sets Lambda that admit an invariant measure absolutely continuous with respect to the Hausdorff measure on Lambda. Let HD(s)(Lambda) and HD(u)(Lambda) be, respectively, the Hausdorff dimensions of the stable and unstable leaves intersected with the hyperbolic basic set Lambda. If HD(u)(Lambda) = 1, then the Lipschitz conjugacy is, in fact, a C(1+H) conjugacy in (i) and (ii). We prove that if the stable arc exchange system is a C(1+HDs+alpha) fixed point of renormalization with bounded geometry, then the stable arc exchange system is smoothly conjugate to an affine stable arc exchange system.
Abstract:
Pseudomonas aeruginosa preferentially utilizes C(4)-dicarboxylates such as malate, fumarate, and succinate as carbon and energy sources. We have identified and characterized two C(4)-dicarboxylate transport (Dct) systems in P. aeruginosa PAO1. Inactivation of the dctA (PA1183) gene caused a growth defect of the strain in minimal media supplemented with succinate, fumarate or malate, indicating that DctA has a major role in Dct. However, residual growth of the dctA mutant in these media suggested the presence of additional C(4)-dicarboxylate transporter(s). Tn5 insertion mutagenesis of the ΔdctA mutant led to the identification of a second Dct system, i.e., the DctPQM transporter, belonging to the tripartite ATP-independent periplasmic (TRAP) family of carriers. The ΔdctA ΔdctPQM double mutant showed no growth on malate and fumarate and residual growth on succinate, suggesting that DctA and DctPQM are the only malate and fumarate transporters, whereas additional transporters for succinate are present. Using lacZ reporter fusions, we showed that the expression of the dctA gene and the dctPQM operon was enhanced in early exponential growth phase and induced by C(4)-dicarboxylates. Competition experiments demonstrated that the DctPQM carrier was more efficient than the DctA carrier for the utilization of succinate at micromolar concentrations, whereas DctA was the major transporter at millimolar concentrations. In conclusion, this is the first report of the high- and low-affinity succinate uptake systems DctPQM and DctA functioning coordinately to transport C(4)-dicarboxylates, and of the alternative sigma factor RpoN and a DctB/DctD two-component system simultaneously regulating the dctA gene and the dctPQM operon.
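The reported division of labour between a high-affinity and a low-affinity transporter can be illustrated with simple Michaelis-Menten kinetics. The Km and Vmax values below are hypothetical, chosen only to reproduce the qualitative crossover between micromolar and millimolar substrate concentrations; they are not measured constants for DctPQM or DctA.

```python
# Michaelis-Menten uptake rate v = Vmax * S / (Km + S) for two transporters.
# Km/Vmax values are hypothetical illustrations, not measured constants.

def uptake(s, vmax, km):
    """Uptake rate at substrate concentration s (in mol/L)."""
    return vmax * s / (km + s)

# Hypothetical: DctPQM = high affinity (low Km), low capacity;
#               DctA   = low affinity (high Km), high capacity.
dctpqm = dict(vmax=5.0, km=2e-6)    # Km ~ 2 uM
dcta   = dict(vmax=50.0, km=1e-3)   # Km ~ 1 mM

for s in (1e-6, 1e-3):              # 1 uM vs 1 mM succinate
    v_pqm = uptake(s, **dctpqm)
    v_a = uptake(s, **dcta)
    dominant = "DctPQM" if v_pqm > v_a else "DctA"
    print(f"S = {s:.0e} M: {dominant} dominates")
```

With these illustrative parameters the high-affinity carrier dominates at micromolar substrate levels while the high-capacity carrier takes over at millimolar levels, matching the qualitative behaviour described in the abstract.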
Abstract:
Positive-operator-valued measurements on a finite number N of identically prepared systems of arbitrary spin J are discussed. Pure states are characterized in terms of Bloch-like vectors restricted by an SU(2J+1) covariant constraint. This representation allows for a simple description of the equations to be fulfilled by optimal measurements. We explicitly find the minimal positive-operator-valued measurement for the N=2 case, derive a rigorous bound for N=3, and set up the analysis for arbitrary N.
Abstract:
Optimal and finite positive operator valued measurements on a finite number N of identically prepared systems have recently been presented. With physical realization in mind, we propose here optimal and minimal generalized quantum measurements for two-level systems. We explicitly construct them up to N = 7 and verify that they are minimal up to N = 5.
Abstract:
The safe and responsible development of engineered nanomaterials (ENM), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano" legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to risk assessment of ENMs, which encompass the key parameters to characterise ENMs, appropriate methods of analysis, and the best approach to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn: Owing to the high batch variability of the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Concomitant with using the OECD priority list of ENMs, other criteria for the selection of ENMs, such as relevance for mechanistic (scientific) studies or risk-assessment-based studies, widespread availability (and thus high expected volumes of use) or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focuses on the validity of OECD tests; therefore, source material will be first in scope for testing. However, for risk assessment it is much more relevant to have toxicity data from material as present in the products/matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally, these methods can determine only a single characteristic, and some of them can be rather expensive. Practically, it is currently not feasible to fully characterise ENMs. Many techniques that are available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs).
It was recommended that at least two complementary techniques should be employed to determine a metric of ENMs. The first great challenge is to prioritise the metrics that are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that a single metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach/protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization should be initiated and that exchange of protocols should take place. The precise methods used to disperse ENMs should be specifically, yet succinctly, described within the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with the current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
Angiogenesis plays a key role in tumor growth and cancer progression. TIE-2-expressing monocytes (TEM) have been reported to account critically for tumor vascularization and growth in mouse experimental tumor models, but the molecular basis of their pro-angiogenic activity is largely unknown. Moreover, differences in pro-angiogenic activity between blood-circulating and tumor-infiltrated TEM in human patients have not been established to date, hindering the identification of specific targets for therapeutic intervention. In this work, we investigated these differences and the phenotypic reversal of breast tumor pro-angiogenic TEM to a weakly pro-angiogenic phenotype by combining Boolean modelling and experimental approaches. First, we show that in breast cancer patients the pro-angiogenic activity of TEM increased drastically from blood to tumor, suggesting that the tumor microenvironment shapes the highly pro-angiogenic phenotype of TEM. Second, we predicted in silico all minimal perturbations transitioning the highly pro-angiogenic phenotype of tumor TEM to the weakly pro-angiogenic phenotype of blood TEM and vice versa. The in silico predicted perturbations were validated experimentally using patient TEM. In addition, gene expression profiling of TEM transitioned to a weakly pro-angiogenic phenotype confirmed that TEM are plastic cells and can be reverted to immunologically potent monocytes. Finally, relapse-free survival analysis showed a statistically significant difference between patients whose tumors had high and low expression values for the genes encoding the transitioning proteins detected in silico and validated on patient TEM. In conclusion, the inferred TEM regulatory network accurately captured experimental TEM behavior and highlighted crosstalk between specific angiogenic and inflammatory signaling pathways of outstanding importance for controlling their pro-angiogenic activity.
Results showed the successful in vitro reversal of this activity by perturbing the in silico predicted target genes in tumor-derived TEM, and indicated that targeting tumor TEM plasticity may constitute a novel and valid therapeutic strategy in breast cancer.
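The exhaustive search for minimal perturbations described above can be sketched on a toy Boolean network. The three nodes and update rules below are hypothetical and far smaller than the TEM regulatory network of the study; the sketch only shows the principle of clamping nodes and enumerating perturbations of increasing size until the target phenotype is reached.

```python
from itertools import combinations, product

# Toy 3-node Boolean network (hypothetical rules, not the inferred TEM
# network): each node's next state is a function of the current state.
rules = {
    "TIE2":   lambda s: s["TIE2"] and not s["INFLAM"],
    "ANGIO":  lambda s: s["TIE2"],
    "INFLAM": lambda s: s["INFLAM"],
}

def step(state, fixed):
    nxt = {n: rules[n](state) for n in rules}
    nxt.update(fixed)          # perturbed nodes stay clamped
    return nxt

def attractor(state, fixed, n=20):
    """Iterate the synchronous update long enough to settle."""
    state = dict(state, **fixed)
    for _ in range(n):
        state = step(state, fixed)
    return state

start = {"TIE2": True, "ANGIO": True, "INFLAM": False}  # pro-angiogenic

# Enumerate perturbations of increasing size; stop at the smallest size
# for which some clamping switches ANGIO off (the weak phenotype).
minimal = []
for k in range(1, len(rules) + 1):
    for nodes in combinations(rules, k):
        for vals in product([False, True], repeat=k):
            fixed = dict(zip(nodes, vals))
            if not attractor(start, fixed)["ANGIO"]:
                minimal.append(fixed)
    if minimal:
        break

print(minimal)   # all size-1 perturbations reaching the target phenotype
```

In this toy network every single-node clamp that removes TIE-2 signalling (directly or via the inflammatory node) suffices, so all minimal perturbations have size one; real networks require exactly this kind of search, only over far more nodes.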
Abstract:
The kinematics of the anatomical shoulder are analysed and modelled as a parallel mechanism similar to a Stewart platform. A new method is proposed to describe the shoulder kinematics with minimal coordinates and solve the indeterminacy. The minimal coordinates are defined from bony landmarks and the scapulothoracic kinematic constraints. Independent from one another, they uniquely characterise the shoulder motion. A humanoid mechanism is then proposed with identical kinematic properties. It is then shown how minimal coordinates can be obtained for this mechanism and how the coordinates simplify both the motion-planning task and trajectory-tracking control. Lastly, the coordinates are also shown to have an application in the field of biomechanics where they can be used to model the scapulohumeral rhythm.
Abstract:
This dissertation describes a networking approach to infinite-dimensional systems theory, where there is a minimal distinction between inputs and outputs. We introduce and study two closely related classes of systems, namely the state/signal systems and the port-Hamiltonian systems, and describe how they relate to each other. Some basic theory for these two classes of systems and the interconnections of such systems is provided. The main emphasis lies on passive and conservative systems, and the theoretical concepts are illustrated using the example of a lossless transfer line. Much remains to be done in this field and we point to some directions for future studies as well.
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, and so on. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace. This leaves companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the associated cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, usability, etc., also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing.
We present approaches to automatically generate tests from behavioral models to address some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process. Requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or absent tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
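As a minimal illustration of generating tests from a behavioral model, the sketch below derives a transition-covering test suite from a hypothetical finite-state model. The calculator model and all names are invented for illustration; the thesis itself works with UML models and industrial tool chains.

```python
from collections import deque

# Hypothetical behavioral model of a calculator session, as a finite state
# machine: (state, input) -> next state. Test generation walks the model to
# cover every transition, a common model-based testing criterion.
fsm = {
    ("idle", "power_on"): "ready",
    ("ready", "digit"): "entering",
    ("entering", "digit"): "entering",
    ("entering", "equals"): "result",
    ("result", "clear"): "ready",
    ("ready", "power_off"): "idle",
}

def transition_cover(start):
    """BFS from the start state; return one input sequence per transition."""
    paths = {start: []}          # shortest input sequence reaching each state
    queue = deque([start])
    while queue:
        state = queue.popleft()
        for (src, inp), dst in fsm.items():
            if src == state and dst not in paths:
                paths[dst] = paths[state] + [inp]
                queue.append(dst)
    # One test per transition: reach the source state, then fire the input.
    return [paths[src] + [inp] for (src, inp) in fsm if src in paths]

tests = transition_cover("idle")
assert len(tests) == len(fsm)    # every transition is exercised once
```

Each generated input sequence would then be executed against the implementation, with the model's expected next states serving as the oracle.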
Abstract:
The present study was carried out to understand (i) the incidence and occurrence of species of Vibrio in different culture systems in and around Cochin, (ii) the characteristics of the vibrio isolates, their ecology including growth response to various hydrological parameters, and their sensitivity to about 40 antibiotics, and (iii) the role of physico-chemical parameters in the pathogenicity of vibrios; the results that emerged from the investigations are important and encouraging for a better understanding of 'vibriosis' in the culture systems and for developing remedial measures to control diseases. The thesis begins with an “Introduction”, followed by “A review of literature” on diseases of penaeid shrimps with particular reference to 'vibriosis', and “Material and methods”, which details the methods and procedures followed in the experiments and the analyses of data. The thesis consists of three chapters. Chapter 1 deals with the incidence and ecology of Vibrio spp. in water, sediment and in juveniles of the Indian white prawn Penaeus indicus in the culture systems. Chapter 2 details the characteristics of the vibrio isolates, including growth response to various levels of temperature, salinity and pH, sensitivity to 40 antibiotics, and minimal inhibitory concentration tests. Chapter 3 discusses the role of physico-chemical parameters in the incidence and seasonal abundance of Vibrio spp. and in 'vibriosis'. A summary of the whole work and a list of references are included at the end. This study gives detailed information regarding the incidence and ecology of vibrios in the culture systems, their characteristics and pathogenicity.
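As a small illustration of the minimal inhibitory concentration (MIC) tests mentioned above: in a broth-microdilution assay, the MIC is read as the lowest concentration in a two-fold dilution series that prevents visible growth. The concentrations and growth readings below are hypothetical, not data from this study.

```python
# Reading a broth-microdilution MIC: the MIC is the lowest concentration in
# a two-fold dilution series that inhibits visible growth. All values below
# are hypothetical illustrations.

def mic(dilutions, growth):
    """dilutions: concentrations tested; growth: True if growth observed."""
    inhibiting = [c for c, g in zip(dilutions, growth) if not g]
    return min(inhibiting) if inhibiting else None   # None: no inhibition

concs = [64, 32, 16, 8, 4, 2, 1, 0.5]          # ug/mL, two-fold series
observed = [False, False, False, True, True, True, True, True]
assert mic(concs, observed) == 16              # lowest inhibiting concentration
```

If growth occurs in every well, the MIC exceeds the highest concentration tested, which the sketch signals by returning `None`.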
Abstract:
Since no physical system can ever be completely isolated from its environment, the study of open quantum systems is pivotal for reliably and accurately controlling complex quantum systems. In practice, the reliability of the control field needs to be confirmed via certification of the target evolution, while accuracy requires the derivation of high-fidelity control schemes in the presence of decoherence. In the first part of this thesis, an algebraic framework is presented that makes it possible to determine the minimal requirements for the unique characterisation of arbitrary unitary gates in open quantum systems, independent of the particular physical implementation of the employed quantum device. To this end, a set of theorems is devised that can be used to assess whether a given set of input states on a quantum channel is sufficient to judge whether a desired unitary gate is realised. This makes it possible to determine the minimal input for such a task, which proves to be, quite remarkably, independent of system size. These results elucidate the fundamental limits regarding certification and tomography of open quantum systems. Combining these insights with state-of-the-art Monte Carlo process certification techniques permits a significant improvement in scaling when certifying arbitrary unitary gates. This improvement is not restricted to quantum information devices where the basic information carrier is the qubit; it also extends to systems where the fundamental informational entities can be of arbitrary dimensionality, the so-called qudits. The second part of this thesis concerns the impact of these findings from the point of view of Optimal Control Theory (OCT). OCT for quantum systems utilises concepts from engineering, such as feedback and optimisation, to engineer constructive and destructive interferences in order to steer a physical process in a desired direction.
It turns out that the aforementioned mathematical findings make it possible to deduce novel optimisation functionals that significantly reduce not only the memory required by numerical control algorithms but also the total CPU time required to obtain a certain fidelity for the optimised process. The thesis concludes by discussing two problems of fundamental interest in quantum information processing from the point of view of optimal control: the preparation of pure states and the implementation of unitary gates in open quantum systems. For both cases, specific physical examples are considered: for the former, the vibrational cooling of molecules via optical pumping, and for the latter, a superconducting phase qudit implementation. In particular, it is illustrated how features of the environment can be exploited to reach the desired targets.
Abstract:
Almost 450 nuclear power plants are currently operating throughout the world, supplying about 17% of the world's electricity. These plants perform safely and reliably, with no free release of byproducts to the environment. Given the current rate of growth in electricity demand and the ever-growing concerns for the environment, the US consumer will favor energy sources that can satisfy the need for electricity and other energy-intensive products (1) on a sustainable basis with minimal environmental impact, (2) with enhanced reliability and safety, and (3) with competitive economics. Provided that advances are made to fully realize the potential benefits of nuclear energy systems, the next generation of nuclear systems can provide a vital part of a long-term, diversified energy supply. The Department of Energy has begun research on such a new generation of nuclear energy systems that can be made available to the market by 2030 or earlier, and that can offer significant advances toward these challenging goals [1]. These future nuclear power systems will require advances in materials, reactor physics and heat transfer to realize their full potential. In this paper, a summary of these advanced nuclear power systems is presented along with a short synopsis of the important heat transfer issues. Given the nature of research and the dynamics of these conceptual designs, key aspects of the physics are provided, with details left for the presentation.
Abstract:
Fire blight is a disease affecting plants of the family Rosaceae, caused by the bacterium Erwinia amylovora. Its host range includes fruit trees, such as pear, apple and quince, and ornamental plants of great commercial and economic interest. The disease has now spread and is widely distributed throughout the temperate regions of the world. In Spain, where the disease is not endemic, fire blight was first detected in 1995 in the north of the country (Euskadi); several outbreaks have since appeared in other locations and have been duly eradicated. Control of fire blight is very ineffective in plants already affected by the disease, so it relies on measures aimed at preventing the dispersal of the pathogen and the introduction of the disease into non-endemic regions. In this work, thermotherapy was evaluated as a method for eradicating E. amylovora from asymptomatic plant propagation material. Thermotherapy was shown to be a viable method of eradicating E. amylovora from propagation material. Almost all rosaceous species and varieties kept under humid conditions survived 7 hours at 45 ºC and more than 3 hours at 50 ºC, whereas more than 1 hour of exposure to 50 ºC with dry heat damaged the plant material and reduced sprouting. Treatments of 60 min at 45 ºC or 30 min at 50 ºC were sufficient to reduce the epiphytic population of E. amylovora to undetectable levels (5 x 10^2 cfu g-1 fresh weight) on pear branches. Phosphonate derivatives and benzothiadiazole are effective in controlling fire blight in pear and apple trees under laboratory, greenhouse and field conditions. Plant defence inducers reduce disease levels by 40-60%.
The minimum time intervals to achieve the best disease control were 5 days for fosetyl-Al, and 7 days for ethephon and benzothiadiazole, and the optimal doses for fosetyl-Al and benzothiadiazole were 3.72 g HPO3^2- L-1 and 150 mg a.i. L-1, respectively. The efficacy of fosetyl-Al and benzothiadiazole in controlling fire blight is improved when they are combined with antibiotics at half the dose of the latter. Although the strategy of mixing products is more practical and easier to carry out in the field than the strategy of combining products, the best level of disease control is achieved with the combination strategy. The effect of benzothiadiazole and phosphonates on the Erwinia amylovora-pear interaction was analysed at the histological and ultrastructural level. Neither benzothiadiazole, nor fosetyl-Al, nor ethephon induced structural changes in pear tissues 7 days after their application. However, after inoculation with E. amylovora, structural cell disorganization was observed in plants treated with fosetyl-Al and ethephon, whereas in plants treated with benzothiadiazole these tissue alterations were delayed. Two models (Maryblyt, Cougarblight) were evaluated in a field in Spain affected by the disease, to determine the accuracy of their predictions. Two models were used to produce the risk map: the combined BRS-Powell and the modified BIS95. The results showed two zones with high and low disease risk. Maryblyt and Cougarblight are easy-to-use models, although their implementation in disease management programmes requires that they be evaluated and validated over a longer period of time and in areas where the disease is present.