968 results for Open quantum systems
Abstract:
We determine numerically the single-particle and the two-particle spectrum of the three-state quantum Potts model on a lattice by using the density matrix renormalization group method, and extract information on the asymptotic (small momentum) S-matrix of the quasiparticles. The low energy part of the finite size spectrum can be understood in terms of a simple effective model introduced in a previous work, and is consistent with an asymptotic S-matrix of an exchange form below a momentum scale p*. This scale appears to vanish faster than the Compton scale, mc, as one approaches the critical point, suggesting that a dangerously irrelevant operator may be responsible for the behaviour observed on the lattice.
Abstract:
We present transport measurements on a system of two lateral quantum dots in a perpendicular magnetic field. Due to edge channel formation in an open conducting region, the quantum dots are chirally coupled. When both quantum dots are tuned into the Kondo regime simultaneously, we observe a change in the temperature dependence of the differential conductance. This is explained by the RKKY exchange interaction between the two dots. As a function of bias the differential conductance shows a splitting of the Kondo resonance which changes in the presence of RKKY interaction.
Abstract:
Intrusion Detection Systems (IDSs) provide an important layer of security for computer systems and networks, and are becoming increasingly necessary as reliance on Internet services grows and systems holding sensitive data are more commonly open to Internet access. An IDS's responsibility is to detect suspicious or unacceptable system and network activity and to alert a systems administrator to it. The majority of IDSs use a set of signatures that define what constitutes suspicious traffic, and Snort is one popular and actively developed open-source IDS that uses such a set of signatures, known as Snort rules. Our aim is to identify a way in which Snort could be developed further by generalising rules to identify novel attacks. In particular, we attempted to relax and vary the conditions and parameters of current Snort rules, using an approach similar to classic rule-learning operators such as generalisation and specialisation. We demonstrate the effectiveness of our approach through experiments with standard datasets and show that we are able to detect previously undetected variants of various attacks. We conclude by discussing the general effectiveness and appropriateness of generalisation in Snort-based IDS rule processing. Keywords: anomaly detection, intrusion detection, Snort, Snort rules
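The idea of a generalisation operator over rule conditions can be sketched as follows. This is a minimal illustration only: the simplified rule format and the single "widen the destination port to a wildcard" operator are stand-ins, not the authors' actual implementation or the full Snort rule grammar.

```python
# Minimal sketch of a rule-generalisation operator, assuming a
# simplified Snort-style rule header:
#   action proto src_ip src_port -> dst_ip dst_port (options)

def generalise_port(rule: str) -> str:
    """Relax an exact destination port to the 'any' wildcard."""
    head, sep, options = rule.partition(" (")
    parts = head.split()
    # Header has exactly 7 fields; field 7 is the destination port.
    if len(parts) == 7 and parts[6].isdigit():
        parts[6] = "any"
    return " ".join(parts) + sep + options

rule = 'alert tcp any any -> 192.168.1.0/24 80 (msg:"WEB probe"; sid:1000001;)'
print(generalise_port(rule))
# -> alert tcp any any -> 192.168.1.0/24 any (msg:"WEB probe"; sid:1000001;)
```

A real system would apply many such operators (port ranges, content-match relaxation) and then re-evaluate the widened rules against traffic to control false positives.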
Abstract:
In this dissertation I draw a connection between quantum adiabatic optimization, spectral graph theory, heat diffusion, and sub-stochastic processes through the operators that govern these processes and their associated spectra. In particular, we study Hamiltonians which have recently become known as ``stoquastic'' or, equivalently, the generators of sub-stochastic processes. The operators corresponding to these Hamiltonians are of interest in all of the settings mentioned above. I predominantly explore the connection between the spectral gap of an operator, that is, the difference between its two lowest energies, and certain equilibrium behavior. In the context of adiabatic optimization, this corresponds to the likelihood of solving the optimization problem of interest. I will provide an instance of an optimization problem that is easy to solve classically, but leaves open the possibility of being difficult adiabatically. Aside from this concrete example, the work in this dissertation is predominantly mathematical, and we focus on bounding the spectral gap. Our primary tool for doing this is spectral graph theory, which provides the most natural approach to this task by simply considering Dirichlet eigenvalues of subgraphs of host graphs. I will derive tight bounds for the gap of one-dimensional, hypercube, and general convex subgraphs. The techniques used will also adapt methods recently used by Andrews and Clutterbuck to prove the long-standing ``Fundamental Gap Conjecture''.
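The one-dimensional case admits a closed form that makes the spectral-gap idea concrete. For a path with n interior vertices viewed as a subgraph of the line with Dirichlet boundary conditions, the Dirichlet Laplacian eigenvalues are lambda_k = 4 sin^2(k*pi / (2(n+1))) with eigenvectors v_k(j) = sin(j*k*pi/(n+1)); this standard result is only an illustration of the kind of bound discussed above, not the dissertation's own derivation.

```python
import math

# Dirichlet eigenvalues of a path subgraph with n interior vertices:
#   lambda_k = 4 * sin^2(k * pi / (2 * (n + 1))),  k = 1..n
# The spectral gap is lambda_2 - lambda_1 ~ 3*pi^2 / (n+1)^2 for large n.

def dirichlet_eigenvalue(n: int, k: int) -> float:
    return 4 * math.sin(k * math.pi / (2 * (n + 1))) ** 2

def spectral_gap(n: int) -> float:
    return dirichlet_eigenvalue(n, 2) - dirichlet_eigenvalue(n, 1)

# Sanity check: apply the tridiagonal Dirichlet Laplacian
# (L v)_j = 2 v_j - v_{j-1} - v_{j+1}, with v = 0 on the boundary,
# to the known eigenvector and confirm L v = lambda v entrywise.
n, k = 8, 1
lam = dirichlet_eigenvalue(n, k)
v = [math.sin(j * k * math.pi / (n + 1)) for j in range(1, n + 1)]
for j in range(n):
    left = v[j - 1] if j > 0 else 0.0
    right = v[j + 1] if j < n - 1 else 0.0
    assert abs((2 * v[j] - left - right) - lam * v[j]) < 1e-12

print(f"gap for n={n}: {spectral_gap(n):.6f}")
```

The quadratic decay of the gap with subgraph length is exactly the quantity the adiabatic-runtime bounds in the dissertation control.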
Abstract:
Climate change and carbon (C) sequestration are a major focus of research in the twenty-first century. Globally, soils store about 300 times the amount of C that is released per annum through the burning of fossil fuels (Schulze and Freibauer 2005). Land clearing and introduction of agricultural systems have led to rapid declines in soil C reserves. The recent introduction of conservation agricultural practices has not led to a reversing of the decline in soil C content, although it has minimized the rate of decline (Baker et al. 2007; Hulugalle and Scott 2008). Lal (2003) estimated the quantum of C pools in the atmosphere, terrestrial ecosystems, and oceans and reported a “missing C” component in the world C budget. Though not proven yet, this could be linked to C losses through runoff and soil erosion (Lal 2005) and a lack of C accounting in inland water bodies (Cole et al. 2007). Land management practices to minimize the microbial respiration and soil organic C (SOC) decline such as minimum tillage or no tillage were extensively studied in the past, and the soil erosion and runoff studies monitoring those management systems focused on other nutrients such as nitrogen (N) and phosphorus (P).
Abstract:
This paper reports the results of a postal survey of intermediate care co-ordinators (ICCs) on the organization and delivery of intermediate care services for older people in England, conducted between November 2003 and May 2004. Questionnaires, which covered a range of issues with a variety of quantitative, ‘tick-box’ and open-ended questions, were returned by 106 respondents, representing just over 35% of primary care trusts (PCTs). We discuss the role of ICCs, the integration of local systems of intermediate care provision, and the form, function and model of delivery of services described by respondents. Using descriptive and statistical analysis of the responses, we highlight in particular the relationship between provision of admission avoidance and supported discharge, the availability of 24-hour care, and the locations in which care is provided, and relate our findings to the emerging evidence base for intermediate care, guidance on implementation from central government, and debate in the literature. Whilst the expansion and integration of intermediate care appear to be continuing apace, much provision seems concentrated in supported discharge services rather than acute admission avoidance, and particularly in residential forms of post-acute intermediate care. Supported discharge services tend to be found in residential settings, while admission avoidance provision tends to be non-residential in nature. Twenty-four hour care in non-residential settings is not available in several responding PCTs. These findings raise questions about the relationship between the implementation of intermediate care and the evidence for and aims of the policy as part of NHS modernization, and the extent to which intermediate care represents a genuinely novel approach to the care and rehabilitation of older people.
Abstract:
Finding equilibration times is a major unsolved problem in physics with few analytical results. Here we look at equilibration times for quantum gases of bosons and fermions in the regime of negligibly weak interactions, a setting which not only includes paradigmatic systems such as gases confined to boxes, but also Luttinger liquids and the free superfluid Hubbard model. To do this, we focus on two classes of measurements: (i) coarse-grained observables, such as the number of particles in a region of space, and (ii) few-mode measurements, such as phase correlators. We show that, in this setting, equilibration occurs quite generally despite the fact that the particles are not interacting. Furthermore, for coarse-grained measurements the timescale is generally at most polynomial in the number of particles N, which is much faster than previous general upper bounds, which were exponential in N. For local measurements on lattice systems, the timescale is typically linear in the number of lattice sites. In fact, for one-dimensional lattices, the scaling is generally linear in the length of the lattice, which is optimal. Additionally, we look at a few specific examples, one of which consists of N fermions initially confined on one side of a partition in a box. The partition is removed and the fermions equilibrate extremely quickly, in time O(1/N).
Abstract:
Developments in theory and experiment have raised the prospect of an electronic technology based on the discrete nature of electron tunnelling through a potential barrier. This thesis deals with novel design and analysis tools developed to study such systems. Possible devices include those constructed from ultrasmall normal tunnelling junctions. These exhibit charging effects including the Coulomb blockade and correlated electron tunnelling. They allow transistor-like control of the transfer of single carriers, and present the prospect of digital systems operating at the information-theoretic limit. As such, they are often referred to as single electronic devices. Single electronic devices exhibit self-quantising logic and good structural tolerance. Their speed, immunity to thermal noise, and operating voltage all scale beneficially with junction capacitance. For ultrasmall junctions the possibility of room-temperature operation at sub-picosecond timescales seems feasible. However, they are sensitive to external charge, whether from trapping-detrapping events, externally gated potentials, or system cross-talk. Quantum effects such as macroscopic quantum tunnelling of charge may degrade performance. Finally, any practical system will be complex and spatially extended (amplifying the above problems), and prone to fabrication imperfection. This summarises why new design and analysis tools are required. Simulation tools are developed, concentrating on the basic building blocks of single electronic systems: the tunnelling junction array and the gated turnstile device. Three main points are considered: the best method of estimating capacitance values from the physical system geometry; the mathematical model that should represent electron tunnelling based on these data; and the application of this model to the investigation of single electronic systems. (DXN004909)
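The capacitance scaling mentioned above can be made quantitative with a standard back-of-envelope estimate: Coulomb blockade requires the single-electron charging energy E_C = e^2/(2C) to dominate thermal fluctuations k_B*T. The margin factor of 10 below is an illustrative assumption, not a figure from the thesis.

```python
# Sketch: how small must a tunnel junction be for Coulomb blockade
# at a given temperature? Requires E_C = e^2/(2C) >> k_B * T.

E = 1.602176634e-19   # elementary charge (C)
K_B = 1.380649e-23    # Boltzmann constant (J/K)

def charging_energy(capacitance_f: float) -> float:
    """Single-electron charging energy E_C = e^2 / (2C), in joules."""
    return E ** 2 / (2 * capacitance_f)

def max_capacitance(temperature_k: float, margin: float = 10.0) -> float:
    """Largest junction capacitance with E_C >= margin * k_B * T."""
    return E ** 2 / (2 * margin * K_B * temperature_k)

# Room-temperature operation demands attofarad-scale junctions:
print(f"{max_capacitance(300.0) * 1e18:.2f} aF")
```

This sub-attofarad requirement at 300 K is why the abstract ties room-temperature operation to ultrasmall junctions.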
Abstract:
Automation technologies are widely acclaimed to have the potential to significantly reduce energy consumption and energy-related costs in buildings. However, despite the abundance of commercially available technologies, automation in domestic environments keeps meeting commercial failure. The main reason for this is the development process used to build automation applications, which tends to focus more on technical aspects than on the needs and limitations of the users. An instance of this problem is the complex and poorly designed home automation front-ends that deter customers from investing in a home automation product. On the other hand, developing a usable and interactive interface is a complicated task for developers due to the multidisciplinary challenges that need to be identified and solved. In this context, the current research work investigates the different design problems associated with developing a home automation interface, as well as the existing design solutions applied to these problems. A Qualitative Data Analysis approach was used to collect data from research papers, and an open coding process was used to cluster the findings. From the analysis of the data collected, requirements for designing the interface were derived. A home energy management functionality for a Web-based home automation front-end was developed as a proof of concept, and a user evaluation was used to assess the usability of the interface. The results of the evaluation showed that this holistic approach to designing interfaces improved usability, which increases the chances of commercial success.
Abstract:
POSTDATA is a five-year European Research Council (ERC) Starting Grant project that started in May 2016 and is hosted by the Universidad Nacional de Educación a Distancia (UNED), Madrid, Spain. The context of the project is the corpora of European Poetry (EP), with a special focus on poetic materials from different languages and literary traditions. POSTDATA aims to offer a standardized model in the philological field and a metadata application profile (MAP) for EP in order to build a common classification of all these poetic materials. The information of the Spanish, Italian and French repertoires will be published in the Linked Open Data (LOD) ecosystem. Later we expect to extend the model to include additional corpora. There are a number of Web-Based Information Systems (WIS) in Europe with repertoires of poems available for human consumption but not in an appropriate condition to be accessible and reusable by the Semantic Web. These systems are not interoperable; they are in fact locked in their databases and proprietary software, not suitable to be linked in the Semantic Web. A way to make this data interoperable is to develop a MAP in order to be able to publish this data in the LOD ecosystem, and also to publish new data that will be created and modeled based on this MAP. Creating a common data model for EP is not simple, since the existing data models are based on conceptualizations and terminology belonging to their own poetical traditions, and each tradition has developed an idiosyncratic analytical terminology in a different and independent way over the years. The result of this uncoordinated evolution is a set of varied terminologies to explain analogous metrical phenomena across the different poetic systems, whose correspondences have hardly been studied – see examples in González-Blanco & Rodríguez (2014a and b). This work has to be done by domain experts before the modeling actually starts.
On the other hand, the development of a MAP is a complex task, and it is imperative to follow a method for this development. In recent years Curado Malta & Baptista (2012, 2013a, 2013b) have been studying the development of MAPs in a Design Science Research (DSR) methodological process in order to define a method for the development of MAPs (see Curado Malta (2014)). The output of this DSR process was a first version of a method for the development of Metadata Application Profiles (Me4MAP) (paper to be published). The DSR process is now in the validation phase of the Relevance Cycle, to validate Me4MAP. The development of this MAP for poetry will follow the guidelines of Me4MAP, and this development will be used to validate Me4MAP. The final goal of the POSTDATA project is: i) to publish all the data locked in the WIS as LOD, where any interested agent will be able to build applications over the data in order to serve final users; ii) to build a Web platform where: a) researchers, students and other final users interested in EP will be able to access poems (and their analyses) from all the databases; b) researchers, students and other final users will be able to upload poems and the digitized images of manuscripts, and fill in the information concerning the analysis of the poem, collaboratively contributing to a LOD dataset of poetry.
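The LOD publication step described above can be sketched as mapping a repertoire record to RDF triples. Everything concrete in this sketch is a placeholder: the `example.org` namespace, the record fields, and the use of plain Dublin Core terms are invented for illustration and are not the actual POSTDATA MAP, which is precisely what the project sets out to define.

```python
# Hypothetical sketch: one poem record mapped to RDF triples for LOD
# publication. Namespace and property choices are placeholders only.

BASE = "http://example.org/poetry/"   # placeholder namespace
DC = "http://purl.org/dc/terms/"      # Dublin Core terms vocabulary

def poem_to_triples(poem_id: str, record: dict) -> list[tuple[str, str, str]]:
    """Map known record fields to (subject, predicate, object) triples."""
    subject = BASE + poem_id
    mapping = {"title": DC + "title",
               "creator": DC + "creator",
               "language": DC + "language"}
    return [(subject, mapping[k], v) for k, v in record.items() if k in mapping]

triples = poem_to_triples("poem42", {"title": "Soneto XXIII",
                                     "creator": "Garcilaso de la Vega",
                                     "language": "es"})
for s, p, o in triples:
    print(f'<{s}> <{p}> "{o}" .')   # N-Triples-style serialization
```

A real MAP would additionally constrain metrical properties (rhyme scheme, stanza form) with a domain vocabulary agreed on by the philology experts, which is the hard part the abstract emphasizes.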
Abstract:
We describe the construction and characterization of a new apparatus that can produce degenerate quantum gases of strontium. The realization of degenerate gases is an important first step toward future studies of quantum magnetism. Three of the four stable isotopes of strontium have been cooled into the degenerate regime. The experiment can make nearly pure Bose-Einstein condensates containing approximately 1x10^4 atoms for strontium-86 and approximately 4x10^5 atoms for strontium-84. We have also created degenerate Fermi gases of strontium-87 with a reduced temperature T/T_F of approximately 0.2. The apparatus will be able to produce Bose-Einstein condensates of strontium-88 with straightforward modifications. We also report the first experimental and theoretical results from the strontium project. We have developed a technique to accelerate the continuous loading of strontium atoms into a magnetic trap. By applying a laser addressing the 3P1 to 3S1 transition in our magneto-optical trap, the rate at which atoms populate the magnetically trapped 3P2 state can be increased by up to 65%. Quantum degenerate gases of atoms in the metastable 3P0 and 3P2 states are a promising platform for quantum simulation of systems with long-range interactions. We have performed an initial numerical study of a method to transfer the ground-state degenerate gases that we can currently produce into one of the metastable states via a three-photon transition. Numerical simulations of the optical Bloch equations governing the three-photon transition indicate that >90% of a ground-state degenerate gas can be transferred into a metastable state.
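The kind of optical-Bloch-equation simulation mentioned above can be illustrated with a drastically simplified stand-in: a resonant, decay-free two-level system integrated with fourth-order Runge-Kutta. The actual study concerns a three-photon transition with realistic parameters; this sketch, with an arbitrary Rabi frequency, only shows the coherent population-transfer mechanism.

```python
import math

# Two-level optical Bloch equations on resonance, no decay:
#   du/dt = 0,  dv/dt = -Omega * w,  dw/dt = Omega * v
# where (u, v, w) is the Bloch vector and w the population inversion.

OMEGA = 2 * math.pi * 1.0  # Rabi frequency (arbitrary units, assumed)

def deriv(state):
    u, v, w = state
    return (0.0, -OMEGA * w, OMEGA * v)

def rk4_step(state, dt):
    def add(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = deriv(state)
    k2 = deriv(add(state, k1, dt / 2))
    k3 = deriv(add(state, k2, dt / 2))
    k4 = deriv(add(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Start in the ground state (w = -1) and evolve for a pi-pulse, t = pi/Omega.
state, dt = (0.0, 0.0, -1.0), 1e-4
for _ in range(int(round((math.pi / OMEGA) / dt))):
    state = rk4_step(state, dt)

excited_population = (1 + state[2]) / 2
print(f"excited population after pi-pulse: {excited_population:.4f}")
```

For the resonant two-level case the analytic answer is sin^2(Omega*t/2), so a pi-pulse transfers essentially all of the population, mirroring the >90% transfer efficiency reported for the three-photon scheme.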
Probing the interactions between ionic liquids and water: experimental and quantum chemical approach
Abstract:
For an adequate choice or design of ionic liquids, knowledge of their interactions with other solutes and solvents is essential for predicting the reactivity and selectivity of systems involving these compounds. In this work, the activity coefficient of water in several imidazolium-based ionic liquids with the common cation 1-butyl-3-methylimidazolium was measured at 298.2 K. To contribute to a deeper insight into the interaction between ionic liquids and water, COSMO-RS was used to predict the activity coefficient of water in the studied ionic liquids along with the excess enthalpies. The results showed good agreement between the experimental and predicted activity coefficients of water in ionic liquids, and that the interaction of water and ionic liquids was strongly influenced by the hydrogen bonding of the anion with water. Accordingly, the intensity of interaction of the anions with water can be ranked as follows: [CF3SO3](-) < [SCN](-) < [TFA](-) < Br(-) < [TOS](-) < Cl(-) < [CH3SO3](-) [DMP](-) < [Ac](-). In addition, fluorination and aromatization of the anions are shown to reduce their interaction with water. The effect of temperature on the activity coefficient of water at infinite dilution was measured by inverse gas chromatography and predicted by COSMO-RS. Further analysis based on COSMO-RS provided information on the nature of hydrogen bonding between water and the anion, as well as the possibility of anion-water complex formation.
Abstract:
The development of robots has proven to be a very complex interdisciplinary research field. The predominant procedure for these developments in recent decades has been based on the assumption that each robot is a fully personalized project, with the direct embedding of hardware and software technologies in robot parts with no level of abstraction. Although this methodology has brought countless benefits to robotics research, it has also imposed major drawbacks: (i) the difficulty of reusing hardware and software parts in new robots or new versions; (ii) the difficulty of comparing the performance of different robot parts; and (iii) the difficulty of adapting development needs, at both the hardware and software levels, to the expertise of local groups. Large advances might be achieved, for example, if the physical parts of a robot could be reused in a different robot constructed with other technologies by another researcher or group. This paper proposes a framework for robots, TORP (The Open Robot Project), that aims to put forward a standardization in all dimensions (electrical, mechanical and computational) of a shared robot development model. This architecture is based on the dissociation between the robot and its parts, and between the robot parts and their technologies. In this paper, the first specification for a TORP family and the first humanoid robot constructed following the TORP specification set are presented, as well as the advances proposed for their improvement.