951 results for entangled polymer solution theory
Abstract:
Large (10 × 10 cm) sheets of surface-enhanced Raman spectroscopy (SERS)-active polymer have been prepared by stabilising metal nanoparticle aggregates within dry hydroxyethylcellulose (HEC) films. In these films the aggregates are protected by the polymer matrix during storage but are released in use, when aqueous analyte droplets cause the films to swell to their gel form. Because these "Poly-SERS" films can be prepared in bulk but then cut to size and stored in air before use, they provide a cost-effective and convenient method for routine SERS analysis. Here we have tested both Ag and Au Poly-SERS films for use in point-of-care monitoring of therapeutic drugs, using phenytoin as the test compound. Phenytoin in water could readily be detected using Ag Poly-SERS films, but dissolving the compound in phosphate-buffered saline (PBS) to mimic body fluid samples caused loss of the drug signal due to competition for metal surface sites from Cl- ions in the buffer solution. With Au Poly-SERS films, however, there was no detectable interference from Cl-, and these materials allowed phenytoin to be detected at 1.8 mg L-1, even in PBS; the target range for detection of phenytoin in therapeutic drug monitoring is 10-20 mg L-1. With the Au Poly-SERS films, the absolute signal generated by a given concentration of phenytoin was lower than for the parent colloid, but the SERS signals were still high enough for therapeutic monitoring, so the sensitivity cost of moving from simple aqueous colloids to films does not outweigh the practical advantages the films bring, in particular their ease of use and long shelf life.
Abstract:
Queueing theory is the mathematical study of queues, or waiting lines, in which an item from inventory is provided to the customer on completion of service. A typical queueing system consists of a queue and a server. Customers arrive in the system from outside and join the queue in a certain way; the server picks up customers and serves them according to a certain service discipline, and customers leave the system immediately after their service is completed. For queueing systems, queue length, waiting time, and busy period are of primary interest to applications. The theory permits the derivation and calculation of several performance measures, including the average waiting time in the queue or the system, the mean queue length, the traffic intensity, the expected number waiting or receiving service, the mean busy period, the distribution of queue length, and the probability of finding the system in certain states, such as empty, full, having an available server, or having to wait a certain time to be served.
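For the simplest model, the M/M/1 queue (Poisson arrivals at rate λ, exponential service at rate μ, one server), the performance measures listed above have standard closed forms. A minimal sketch, with illustrative rates of our choosing, not taken from the abstract:

```python
# Standard closed-form M/M/1 performance measures.
# The rates lam and mu below are illustrative example values.

def mm1_metrics(lam, mu):
    """Return common M/M/1 measures for arrival rate lam and
    service rate mu (stability requires lam < mu)."""
    assert lam < mu, "queue is unstable unless lam < mu"
    rho = lam / mu                 # traffic intensity (server utilisation)
    L = rho / (1 - rho)            # mean number in the system
    Lq = rho ** 2 / (1 - rho)      # mean queue length (excluding in service)
    W = 1.0 / (mu - lam)           # mean time in system (Little's law: L = lam * W)
    Wq = rho / (mu - lam)          # mean waiting time in the queue
    p_empty = 1 - rho              # probability the system is empty
    return {"rho": rho, "L": L, "Lq": Lq, "W": W, "Wq": Wq, "p_empty": p_empty}

# e.g. 2 arrivals per minute served at 5 per minute:
m = mm1_metrics(lam=2.0, mu=5.0)
```

Note how Little's law ties the measures together: the mean number in the system always equals the arrival rate times the mean time in the system.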
Abstract:
The aim of this dissertation was to investigate flexible polymer-nanoparticle composites with unique magnetic and electrical properties. Toward this goal, two distinct projects were carried out. The first explored the magneto-dielectric properties and morphology of flexible polymer-nanoparticle composites possessing high permeability (µ), high permittivity (ε), and minimal dielectric and magnetic losses (tan δε, tan δµ). The main materials challenges were the synthesis of magnetic nanoparticle fillers displaying high saturation magnetization (Ms) and limited coercivity, and their homogeneous dispersion in a polymeric matrix. Nanostructured magnetic fillers, including polycrystalline iron core-shell nanoparticles and constructively assembled superparamagnetic iron oxide nanoparticles, were synthesized and dispersed uniformly in an elastomer matrix to minimize conductive losses. The resulting composites demonstrated promising permittivity (22.3) and permeability (3), with sustained low dielectric (0.1) and magnetic (0.4) losses at frequencies below 2 GHz. This study demonstrated nanocomposites with a tunable magnetic resonance frequency, which can be used to develop compact, flexible, high-efficiency radio frequency devices. The second project focused on fundamental methods for designing highly conductive polymer-nanoparticle composites that maintain high electrical conductivity under tensile strains exceeding 100%. We investigated a simple solution-spraying method to fabricate stretchable conductors based on elastomeric block copolymer fibers and silver nanoparticles. Silver nanoparticles were assembled both in and around the block copolymer fibers, forming interconnected dual nanoparticle networks that provide conductive pathways both within the fibers and on their outer surfaces. Stretchable composites with conductivity values reaching 9000 S/cm maintained 56% of their initial conductivity after 500 cycles at 100% strain. The manufacturing method developed in this research could pave the way toward direct deposition of flexible electronic devices on substrates of any shape. The electrical and electromechanical properties of these dual silver nanoparticle network composites make them promising materials for future stretchable circuitry in displays, solar cells, antennas, and strain and tactile sensors.
Abstract:
Value and reasons for action are often cited by rationalists and moral realists as providing a desire-independent foundation for normativity. Those maintaining instead that normativity is dependent upon motivation often deny that anything called "value" or "reasons" exists. According to the interest-relational theory, something has value relative to some perspective of desire just in case it satisfies those desires, and a consideration is a reason for some action just in case it indicates that something of value will be accomplished by that action. Value judgements therefore describe real properties of objects and actions, but have no normative significance independent of desires. It is argued that only the interest-relational theory can account for the practical significance of value and reasons for action. Against the Kantian hypothesis of prescriptive rational norms, I attack the alleged instrumental norm or hypothetical imperative, showing that the normative force for taking the means to our ends is explicable in terms of our desire for the end, and not as a command of reason. This analysis also provides a solution to the puzzle concerning the connection between value judgement and motivation. While it is possible to hold value judgements without motivation, the connection is more than accidental: value judgements are usually, but not always, made from the perspective of desires that actually motivate the speaker. In the normal case judgement entails motivation; but often we conversationally borrow external perspectives of desire, and the resulting judgements do not entail motivation. This analysis drives a critique of a common practice as a misuse of normative language. The "absolutist" attempts to use, and, as philosopher, to analyze, normative language in such a way as to justify the imposition of certain interests over others. But these uses and analyses are incoherent: in denying relativity to particular desires, they conflict with the actual meaning of these utterances, which is always indexed to some particular set of desires.
Abstract:
We present ideas about creating a next-generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge in computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so can we not build a similar Intrusion Detection System for our computers? Presumably, such systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation, and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking, and a new 'Danger Theory' (DT) is emerging. This theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals, and the theory is a topic of hot debate. The aim of this research is to investigate this correlation and to translate the DT into the realm of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area. Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
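As a toy illustration of the signal-correlation idea (this is not the project's actual algorithm; the signal names, weights, and threshold below are all invented), a Danger-Theory-style detector raises an alarm only when several danger signals co-occur around an event, rather than whenever the event merely deviates from a learned "self":

```python
# Toy danger-signal correlator: an event is flagged only when enough
# co-occurring danger signals support it. Signal names are invented.

DANGER_SIGNALS = {"cpu_spike", "unexpected_outbound", "file_integrity_fail"}

def danger_score(event_signals, weights=None):
    """Weighted fraction of known danger signals present for this event."""
    weights = weights or {s: 1.0 for s in DANGER_SIGNALS}
    present = DANGER_SIGNALS & set(event_signals)
    return sum(weights[s] for s in present) / sum(weights.values())

def is_attack(event_signals, threshold=0.5):
    # Correlation of multiple signals, not self-nonself deviation,
    # drives the alarm decision.
    return danger_score(event_signals) >= threshold

is_attack(["cpu_spike"])                          # single signal: no alarm
is_attack(["cpu_spike", "unexpected_outbound"])   # correlated signals: alarm
```

The point of the sketch is the shape of the decision: a lone anomalous signal is tolerated (error tolerance), while a correlated cluster of signals is grounded to a specific event and flagged.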
Abstract:
We construct parent Hamiltonians involving only local 2-body interactions for a broad class of projected entangled pair states (PEPS). Making use of perturbation gadget techniques, we define a perturbative Hamiltonian acting on the virtual PEPS space with a finite order low energy effective Hamiltonian that is a gapped, frustration-free parent Hamiltonian for an encoded version of a desired PEPS. For topologically ordered PEPS, the ground space of the low energy effective Hamiltonian is shown to be in the same phase as the desired state to all orders of perturbation theory. An encoded parent Hamiltonian for the double semion string net ground state is explicitly constructed as a concrete example.
Abstract:
A systematic diagrammatic expansion for Gutzwiller wavefunctions (DE-GWFs) proposed very recently is used for the description of the superconducting (SC) ground state in the two-dimensional square-lattice t-J model with hopping electron amplitudes t (and t') between nearest (and next-nearest) neighbors. For the SC state analysis we provide a detailed comparison of the method's results with those of other approaches: (i) the truncated DE-GWF method reproduces the variational Monte Carlo (VMC) results, and (ii) in the lowest (zeroth) order of the expansion the method reproduces the analytical results of the standard Gutzwiller approximation (GA), as well as of the recently proposed 'grand-canonical Gutzwiller approximation' (called either GCGA or SGA). We obtain important features of the SC state. First, the SC gap at the Fermi surface resembles a d(x²-y²) wave only for optimally doped and overdoped systems, being diminished in the antinodal regions for the underdoped case, in qualitative agreement with experiment. Corrections to the gap structure are shown to arise from the longer range of the real-space pairing. Second, the nodal Fermi velocity is almost constant as a function of doping and agrees semi-quantitatively with experimental results. Third, we compare the
Abstract:
We review our work on generalisations of the Becker-Döring model of cluster formation as applied to nucleation theory, polymer growth kinetics, and the formation of supramolecular structures in colloidal chemistry. One valuable tool in analysing mathematical models of these systems has been the coarse-graining approximation, which enables macroscopic models for observable quantities to be derived from microscopic ones. This permits assumptions about the detailed molecular mechanisms to be tested, and their influence on the large-scale kinetics of surfactant self-assembly to be elucidated. We also summarise our more recent results on Becker-Döring systems, notably demonstrating that cross-inhibition and autocatalysis can destabilise a uniform solution and lead to a competitive environment in which some species flourish at the expense of others, phenomena relevant in models of the origins of life.
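The Becker-Döring model itself is a set of kinetic equations for cluster concentrations c_j, in which clusters change size only by gaining or losing a monomer: the flux from size j to j+1 is J_j = a_j c_1 c_j - b_{j+1} c_{j+1}, and dc_j/dt = J_{j-1} - J_j for j ≥ 2. A minimal forward-Euler sketch of these kinetics, with made-up constant rate coefficients purely for illustration (not the coarse-grained or generalised models of the abstract):

```python
# Minimal Becker-Doring integration sketch: clusters grow/shrink by
# monomer addition/loss only. Constant coefficients a, b are illustrative.

def becker_doring_step(c, a=1.0, b=0.5, dt=1e-3):
    """One forward-Euler step. c[0] is the monomer concentration c_1,
    c[j] the concentration of (j+1)-mers; truncated at len(c)."""
    n = len(c)
    # Flux from size j+1 to j+2 (0-indexed): J_j = a*c_1*c_j - b*c_{j+1}
    J = [a * c[0] * c[j] - b * c[j + 1] for j in range(n - 1)]
    dc = [0.0] * n
    for j in range(1, n - 1):
        dc[j] = J[j - 1] - J[j]        # dc_j/dt = J_{j-1} - J_j
    dc[n - 1] = J[n - 2]               # largest tracked cluster only gains
    dc[0] = -2 * J[0] - sum(J[1:])     # monomers consumed by every flux
    return [ci + dt * dci for ci, dci in zip(c, dc)]

c = [1.0] + [0.0] * 9                  # start from monomers only
for _ in range(1000):
    c = becker_doring_step(c)
```

A useful sanity check on any such scheme is mass conservation: with the monomer equation written as above, the total mass Σ_j j·c_j is conserved exactly by the truncated system.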
Abstract:
Portland-polymer composites are promising candidates for use as cementing materials in Northeastern Brazilian oil wells containing heavy oils subjected to steam injection. It is therefore necessary to evaluate their degradation in the common acidizing agents, and identifying how aggressive the different hostile environments are is an important contribution to the choice of acid system. The performance of Portland-polymer composites containing powdered polyurethane, aqueous polyurethane, rubber tire residues, and a biopolymer was investigated; the composites were reinforced with polished SAE 1045 carbon steel for the electrochemical measurements. 15.0% HCl, 6.0% HCl + 1.5% HF (soft mud acid), 12.0% HCl + 3.0% HF (regular mud acid), and 10% HAc + 1.5% HF were used as degrading environments and electrolytes. The acid solution most aggressive towards the plain hardened Portland cement paste was the regular mud acid, which produced a weight loss of around 23.0%, followed by the soft mud acid with 11.0%, 15.0% HCl with 7.0%, and finally 10.0% HAc + 1.5% HF with just 1.0%. The powdered-polyurethane and aqueous-polyurethane composites showed the greatest durability, with a reduction of around 87.0% in weight loss in regular mud acid. The acid attack is superficial and proceeds as an action layer, in which the degraded layer slows the kinetics of the degradation process. This behaviour is seen mainly in the Portland-aqueous polyurethane composite, because its degraded layer is impregnated with chemically modified polymer. The fact that the acid attack generally has no influence on the compressive strength or fractography of the samples confirms this interpretation.
The efficiency of the Portland-polymer composites under acid attack is attributed to their decreased porosity and permeability relative to the plain Portland paste, the smaller quantity of Ca2+ (the element preferentially leached into the acidic solution), the wave effect, and the substitution of part of the degradable bulk by polymer. The electrolyte 10% HAc + 1.5% HF was the least aggressive towards external corrosion of the casing, showing open-circuit potentials of around +250 mV, compared with -130 mV for the simulated pore solution, over the first 24 hours of immersion; this behaviour persisted for at least two months. Similar corrosion rates, around 0.01 μA cm-2, were observed in both electrolytes. High total impedance values, incipient arcs, and large polarization-resistance capacitive arcs in the Nyquist plots, indicating a passivation process, confirm this efficiency. Portland-polymer composites are therefore possible solutions for cementing oil wells subjected to both steam injection and acidizing operations, and 10.0% HAc + 1.5% HF is the least aggressive solution with respect to external corrosion of the casing.
Abstract:
Metal oxide thin films are important for modern electronic devices ranging from thin film transistors to photovoltaics and functional optical coatings. Solution-processing techniques allow thin films to be rapidly deposited over a range of surfaces without the extensive processing required by comparable vapour or physical deposition methods. The production of thin films of vanadium oxide by dip-coating was developed, enabling a greater understanding of thin film formation, and mechanisms for depositing improved large-area uniform coverage on a number of technologically relevant substrates were examined. The fundamental mechanism by which polymer-assisted deposition improves thin film surface smoothness and long-range order is elucidated. Different methods were employed to adapt the alkoxide-based dip-coating technique to produce a variety of amorphous and crystalline vanadium oxide based thin films, and a wide range of material, spectroscopic, and optical measurement techniques were used to study their morphology, structure, and optoelectronic properties. The formation of pinholes on the surface of the thin films, due to dewetting and spinodal effects, was inhibited by the polymer-assisted deposition technique. Uniform thin films with sub-50 nm thicknesses were deposited on a variety of substrates, controlled through alterations to the solvent-alkoxide dilution ratios and the use of polymer-assisted deposition. Polymer-assisted deposition also altered the crystallized VO thin films from a granular surface structure to a polycrystalline structure composed of a high density of small in-plane grains. The formation of transparent VO-based thin films through substrate-mediated diffusion of Si and Na highlighted new methods for material formation and doping.
Abstract:
Data sources are often dispersed geographically in real-life applications. Finding a knowledge model may require joining all the data sources and running a machine learning algorithm on the joint set. We present an alternative based on a Multi Agent System (MAS): an agent mines one data source to extract a local theory (knowledge model) and then merges it with the previous MAS theory using a knowledge fusion technique. In this way we obtain a global theory that summarizes the distributed knowledge without spending resources and time joining the data sources. New experiments have been executed, including statistical significance analysis. The results show that, as a result of knowledge fusion, the accuracy of the initial theories is significantly improved, as is that of the monolithic solution.
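A minimal sketch of the pipeline this abstract describes: each agent induces a local theory from its own source, then fuses it into the running global theory instead of shipping raw data. The stand-in learner and the naive conflict rule below (keep the rule with more support) are our own inventions for illustration; the actual learning algorithm and fusion operator in the work are more sophisticated.

```python
# Sketch of distributed mining with knowledge fusion. A "theory" here is a
# rule set mapping a condition to (predicted class, support count).
# Learner and fusion rule are illustrative stand-ins, not the paper's method.

def learn_local_theory(data):
    """Stand-in learner: majority class per attribute value."""
    counts = {}
    for x, label in data:
        per_cond = counts.setdefault(x, {})
        per_cond[label] = per_cond.get(label, 0) + 1
    return {x: max(c.items(), key=lambda kv: kv[1]) for x, c in counts.items()}

def fuse(global_theory, local_theory):
    """Merge a local theory into the global one; on a conflicting
    condition, keep the rule with the larger support."""
    fused = dict(global_theory)
    for cond, (cls, support) in local_theory.items():
        if cond not in fused or support > fused[cond][1]:
            fused[cond] = (cls, support)
    return fused

# Two geographically dispersed sources, mined independently and fused,
# with no joint data set ever materialised:
site_a = [("high", "risk"), ("high", "risk"), ("low", "safe")]
site_b = [("low", "safe"), ("low", "safe"), ("low", "safe"), ("mid", "risk")]
theory = {}
for source in (site_a, site_b):
    theory = fuse(theory, learn_local_theory(source))
```

The design point is that only compact theories cross the network; each agent's raw data stays where it is, which is what saves the join cost the abstract mentions.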