16 results for special-purpose functionalized conjugated polymers
Abstract:
Conjugated polymers have attracted considerable attention in the last few decades due to their potential for optoelectronic applications. A key step that needs optimisation is charge carrier separation following photoexcitation. To understand better the dynamics of the exciton prior to charge separation, we have performed simulations of the formation and dynamics of localised excitations in single conjugated polymer strands. We use a nonadiabatic molecular dynamics method which allows for the coupled evolution of the nuclear degrees of freedom and of multiconfigurational electronic wavefunctions. We show the relaxation of electron-hole pairs to form excitons and oppositely charged polaron pairs and discuss the modifications to the relaxation process predicted by the inclusion of the Coulomb interaction between the carriers. The issue of charge photogeneration in conjugated polymers in dilute solution is also addressed. (C) 2011 American Institute of Physics. [doi: 10.1063/1.3600404]
Abstract:
Here we survey the theory and applications of a family of methods (correlated electron-ion dynamics, or CEID) that can be applied to a diverse range of problems involving the non-adiabatic exchange of energy between electrons and nuclei. The simplest method, which is a paradigm for the others, is Ehrenfest Dynamics. This is applied to radiation damage in metals and the evolution of excited states in conjugated polymers. It is unable to reproduce the correct heating of nuclei by current carrying electrons, so we introduce a moment expansion that allows us to restore the spontaneous emission of phonons. Because of the widespread use of Non-Equilibrium Green's Functions for computing electric currents in nanoscale systems, we present a comparison of this formalism with that of CEID with open boundaries. When there is strong coupling between electrons and nuclei, the moment expansion does not converge. We thus conclude with a reworking of the CEID formalism that converges systematically and in a stable manner.
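The abstract does not reproduce the equations, but the Ehrenfest mean-field scheme it builds on can be summarised as follows (a standard textbook statement, not taken from the paper itself): the nuclei follow classical trajectories driven by the force averaged over the electronic state, while the electronic wavefunction obeys the time-dependent Schrödinger equation at the instantaneous nuclear geometry:

```latex
i\hbar \,\frac{\partial}{\partial t}\,\Psi_e(t)
  = \hat{H}_e\bigl(\mathbf{R}(t)\bigr)\,\Psi_e(t),
\qquad
M_\alpha\,\ddot{\mathbf{R}}_\alpha
  = -\,\bigl\langle \Psi_e \bigr|\,
      \nabla_{\mathbf{R}_\alpha}\hat{H}_e(\mathbf{R})
    \,\bigl|\Psi_e\bigr\rangle .
```

Because the nuclear force is a mean over the electronic state, the nuclei see only an averaged surface; this is why plain Ehrenfest dynamics misses the spontaneous emission of phonons that the CEID moment expansion described above is designed to restore.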
Abstract:
Public private partnerships (PPP) are an established model for most governments internationally to provide infrastructure-based services, using private finance. Typically the public authority will sign a contract with a special purpose vehicle (SPV), which, because of the holistic nature of PPP, in turn sub-contracts the finance, design, construction, maintenance and soft services to companies that are often related to its shareholders. Thus there is a considerable network of linked organisations that together procure and provide the PPP project. While there is an increasing body of research that examines these PPP projects, much of it is interview or case study based so that the evidence is drawn from a small number of interviews or cases in specific sectors. It also focuses on the public sector procurer and the private sector contractor in the network of organisations. Although it has been recognised that the perceptions of the financiers may vary from those of other key PPP players there is much less research that focuses on the financiers. In this paper we report the results of a postal questionnaire survey, administered to 109 providers of senior debt and equity, from which the response rate was just less than 40%. We supplement these findings with a small number of illustrative quotes from interviewees, where the cited quote represents a commonly held view. We used SPSS and Nvivo to analyse the data. The findings show that when assessing PPPs financiers perceive a very wide range of risks as important, and that it is important to them that many of these risks are either insured or allocated to sub-contractors. When considering participating in PPPs, financiers agree that working with familiar partners on familiar projects and in familiar sectors is important, which may raise barriers to entry and undermine competitive processes.
Abstract:
Quantum coherence between electron and ion dynamics, observed in organic semiconductors by means of ultrafast spectroscopy, is the object of recent theoretical and computational studies. To simulate this kind of quantum coherent dynamics, we have introduced in a previous article [L. Stella, M. Meister, A. J. Fisher, and A. P. Horsfield, J. Chem. Phys. 127, 214104 (2007)] an improved computational scheme based on Correlated Electron-Ion Dynamics (CEID). In this article, we provide a generalization of that scheme to model several ionic degrees of freedom and many-body electronic states. To illustrate the capability of this extended CEID, we study a model system which displays the electron-ion analog of the Rabi oscillations. Finally, we discuss convergence and scaling properties of the extended CEID along with its applicability to more realistic problems. (C) 2011 American Institute of Physics. [doi: 10.1063/1.3589165]
Abstract:
Hardware designers and engineers typically need to explore a multi-parametric design space in order to find the best configuration for their designs using simulations that can take weeks to months to complete. For example, designers of special purpose chips need to explore parameters such as the optimal bitwidth and data representation. This is the case for the development of complex algorithms such as Low-Density Parity-Check (LDPC) decoders used in modern communication systems. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to graphics processing units (GPUs) and FPGAs. Depending on the simulation requirements, the ideal architecture to use can vary. In this paper we propose a new design flow based on OpenCL, a unified multiplatform programming model, which accelerates LDPC decoding simulations, thereby significantly reducing architectural exploration and design time. OpenCL-based parallel kernels are used without modifications or code tuning on multicore CPUs, GPUs and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL for mapping the simulations onto FPGAs. To the best of our knowledge, this is the first time that a single, unmodified OpenCL code is used to target those three different platforms. We show that, depending on the design parameters to be explored in the simulation, and on the dimension and phase of the design, the GPU or the FPGA may suit different purposes more conveniently, providing different acceleration factors. For example, although simulations can typically execute more than 3x faster on FPGAs than on GPUs, the overhead of circuit synthesis often outweighs the benefits of FPGA-accelerated execution.
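The OpenCL kernels themselves are not shown in the abstract; as a rough, hypothetical illustration of the arithmetic such a simulation repeats millions of times per design point, the min-sum check-node update at the heart of many LDPC decoders can be sketched in plain Python (the function name and interface are illustrative assumptions, not the paper's kernels):

```python
def check_node_update(msgs):
    """Min-sum check-node update for one parity check.

    For each incoming message, the outgoing message is the product of the
    signs of all the OTHER messages times the minimum of their magnitudes.
    This is the inner loop a decoder simulation evaluates for every check
    node, every iteration, every simulated codeword.
    """
    out = []
    for i in range(len(msgs)):
        others = msgs[:i] + msgs[i + 1:]
        sign = 1
        for m in others:
            if m < 0:
                sign = -sign
        out.append(sign * min(abs(m) for m in others))
    return out
```

In a real OpenCL kernel each check node (or each edge) would map to a work-item so that thousands of these updates run in parallel on the GPU or FPGA fabric.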
Abstract:
The cell-specific delivery of polynucleic acids (e.g., DNA, RNA), gene therapy, has the potential to treat various diseases. In this chapter we discuss the use of organic electronic materials as non-viral gene delivery vectors and the great potential for electrochemically triggered gene delivery. We highlight some examples in this chapter based on fullerenes (bucky balls and carbon nanotubes), graphenes and electroactive polymers, particularly those that include experiments in vivo.
Abstract:
The design cycle for complex special-purpose computing systems is extremely costly and time-consuming. It involves a multiparametric design space exploration for optimization, followed by design verification. Designers of special purpose VLSI implementations often need to explore parameters, such as optimal bitwidth and data representation, through time-consuming Monte Carlo simulations. A prominent example of this simulation-based exploration process is the design of decoders for error correcting systems, such as the Low-Density Parity-Check (LDPC) codes adopted by modern communication standards, which involves thousands of Monte Carlo runs for each design point. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). The exploitation of diverse target architectures is typically associated with developing multiple code versions, often using distinct programming paradigms. In this context, we evaluate the concept of retargeting a single OpenCL program to multiple platforms, thereby significantly reducing design time. A single OpenCL-based parallel kernel is used without modifications or code tuning on multicore CPUs, GPUs, and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL in order to introduce FPGAs as a potential platform to efficiently execute simulations coded in OpenCL. We use LDPC decoding simulations as a case study. Experimental results were obtained by testing a variety of regular and irregular LDPC codes that range from short/medium (e.g., 8,000 bit) to long length (e.g., 64,800 bit) DVB-S2 codes. We observe that, depending on the design parameters to be simulated, on the dimension and phase of the design, the GPU or FPGA may suit different purposes more conveniently, thus providing different acceleration factors over conventional multicore CPUs.
Abstract:
Institutions involved in the provision of tertiary education across Europe are feeling the pinch. European universities, and other higher education (HE) institutions, must operate in a climate where the pressure of government spending cuts (Garben, 2012) is in stark juxtaposition to the EU’s strategy to drive forward and maintain a growth of student numbers in the sector (Eurostat, 2015).
In order to remain competitive, universities and HE institutions are making ever-greater use of electronic assessment (E-Assessment) systems (Chatzigavriil et al., 2015; Ferrell, 2012). These systems are attractive primarily because they offer a cost-effective and scalable approach to assessment. In addition to scalability, they also offer reliability, consistency and impartiality; furthermore, from the perspective of a student they are popular because they can offer instant feedback (Walet, 2012).
There are disadvantages, though.
First, feedback is often returned to a student immediately on completion of their assessment. It is possible to disable the instant feedback option (this is often done during an end-of-semester exam period, when assessment scores must be ratified before release); however, this tends to be a global ‘all on’ or ‘all off’ configuration option which is controlled centrally rather than configurable on a per-assessment basis.
If a formative in-term assessment is to be taken by multiple groups of students, each at different times, this restriction means that answers to each question will be disclosed to the first group of students undertaking the assessment. As soon as the answers are released “into the wild” the academic integrity of the assessment is lost for subsequent student groups.
Second, the style of feedback provided to a student for each question is often limited to a simple ‘correct’ or ‘incorrect’ indicator. While this type of feedback has its place, it often does not provide a student with enough insight to improve their understanding of a topic that they did not answer correctly.
Most E-Assessment systems boast a wide range of question types including Multiple Choice, Multiple Response, Free Text Entry/Text Matching and Numerical questions. The design of these types of questions is often quite restrictive and formulaic, which has a knock-on effect on the quality of feedback that can be provided in each case.
Multiple Choice Questions (MCQs) are most prevalent as they are the most prescriptive and therefore the most straightforward to mark consistently. They are also the question type most amenable to providing meaningful, relevant feedback for each possible outcome chosen.
Text matching questions tend to be more problematic due to their free text entry nature. Common misspellings or case-sensitivity errors can often be accounted for by the software, but such checks are by no means foolproof: it is very difficult to predict in advance the range of possible variations on an answer that a manual marker of a paper-based equivalent of the same question would consider worthy of marks.
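The fragility described above can be sketched in a few lines (a hypothetical matcher, not a description of any particular E-Assessment product): after simple case and whitespace normalisation, the marker can only recognise the variants that were enumerated in advance.

```python
def match_text(answer, accepted):
    """Mark a free-text answer against a predefined list of accepted forms.

    Normalisation here is deliberately minimal (case folding plus
    whitespace collapsing); any misspelling or synonym not listed in
    `accepted` scores zero, however reasonable a human marker would
    find it.
    """
    norm = " ".join(answer.lower().split())
    return norm in {" ".join(a.lower().split()) for a in accepted}
```

A human marker would likely accept "ethanal" mistyped for "ethanol" in context, but the matcher above cannot, unless the variant was anticipated and listed.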
Numerical questions are similarly restricted. An answer can be checked for accuracy, or for whether it falls within a certain range of the correct answer, but unless the system is a purpose-built mathematical E-Assessment system it is unlikely to have computational capability and so cannot, for example, award the “method marks” which are commonly given in paper-based marking.
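The all-or-nothing character of numeric marking can be made concrete with a short sketch (a hypothetical marking rule, not drawn from any specific system): the only knob available is the tolerance band, with no notion of partial credit for a correct method.

```python
def mark_numeric(answer, correct, rel_tol=0.02):
    """All-or-nothing numeric marking.

    Full marks if the answer lies within a relative tolerance of the
    model answer, zero otherwise. A student whose method was sound but
    who slipped on the final arithmetic scores the same as one who
    guessed -- the 'method marks' of paper-based marking have no
    counterpart here.
    """
    return abs(answer - correct) <= rel_tol * abs(correct)
```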
From a pedagogical perspective, the importance of providing useful formative feedback to students at a point in their learning when they can benefit from it and put it to use cannot be overstated (Grieve et al., 2015; Ferrell, 2012).
In this work, we propose a number of software-based solutions that overcome the limitations and inflexibility of existing E-Assessment systems.
Abstract:
Reliability has emerged as a critical design constraint, especially in memories. Designers go to great lengths to guarantee fault-free operation of the underlying silicon by adopting redundancy-based techniques, which essentially try to detect and correct every single error. However, such techniques come at the cost of large area, power and performance overheads, which lead many researchers to doubt their efficiency, especially for error-resilient systems where 100% accuracy is not always required. In this paper, we present an alternative method focusing on the confinement of the output error induced by any reliability issue. Focusing on memory faults, the proposed method, rather than correcting every single error, exploits the statistical characteristics of the target application and replaces any erroneous data with the best available estimate of that data. To realize the proposed method, a RISC processor is augmented with custom instructions and special-purpose functional units. We apply the method on the enhanced processor by studying the statistical characteristics of the various algorithms involved in a popular multimedia application. Our experimental results show that, in contrast to state-of-the-art fault tolerance approaches, we are able to reduce runtime and area overhead by 71.3% and 83.3%, respectively.
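The confinement idea can be sketched in a few lines (the estimator and interface below are illustrative assumptions, not the paper's actual custom instructions): when a memory read fails its integrity check, the faulty word is replaced by a statistical estimate instead of being bit-corrected.

```python
def confine(read_value, check_ok, history):
    """Error confinement on a memory read.

    If the integrity check passed, use the value and remember it.
    If it failed, do NOT attempt bit-level correction; substitute the
    best available estimate -- here, hypothetically, the last known-good
    value, exploiting the temporal correlation typical of multimedia data.
    """
    if check_ok:
        history.append(read_value)
        return read_value
    return history[-1] if history else 0
```

For error-resilient workloads such as video decoding, an occasional substituted sample perturbs the output only slightly, which is what lets this approach avoid the area and runtime cost of full detect-and-correct redundancy.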
Abstract:
The technical challenges in the design and programming of signal processors for multimedia communication are discussed. The development of terminal equipment to meet such demand presents a significant technical challenge, considering that it is highly desirable that the equipment be cost effective, power efficient, versatile, and extensible for future upgrades. The main challenges in the design and programming of signal processors for multimedia communication are general-purpose signal processor design, application-specific signal processor design, operating systems and programming support, and application programming. The size of the FFT is programmable so that it can be used for various OFDM-based communication systems, such as digital audio broadcasting (DAB), digital video broadcasting-terrestrial (DVB-T) and digital video broadcasting-handheld (DVB-H). The clustered architecture design and distributed ping-pong register files in the PAC DSP raise new challenges of code generation.
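A programmable transform size matters because one engine can then serve every power-of-two OFDM mode (DVB-T, for instance, specifies 2K and 8K modes, i.e. 2048- and 8192-point transforms). As a sketch of the underlying algorithm, a textbook radix-2 Cooley-Tukey FFT handles any power-of-two length with the same code path; this is an illustrative reference implementation, not the signal processor's design:

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two.

    The same recursion covers every mode size -- only the input length
    changes -- which is what makes a single programmable-size FFT engine
    reusable across DAB, DVB-T and DVB-H.
    """
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    t = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + t[k] for k in range(n // 2)] + \
           [even[k] - t[k] for k in range(n // 2)]
```

A hardware engine would of course use an iterative, fixed-point, pipelined structure rather than recursion, but the butterfly arithmetic is the same.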
Abstract:
Colloidal gold nanoparticles (AuNPs) and precipitation of an insoluble product formed by HRP-biocatalyzed oxidation of 3,3'-diaminobenzidine (DAB) in the presence of H2O2 were used to enhance the signal obtained from the surface plasmon resonance (SPR) biosensor. The AuNPs were synthesized and functionalized with HS-OEG(3)-COOH by a self-assembly technique. Thereafter, the HS-OEG(3)-COOH functionalized nanoparticles were covalently conjugated with horseradish peroxidase (HRP) and anti-IgG antibody to form an enzyme-immunogold complex. Characterizations were performed by several methods: UV-vis absorption, DLS, HR-TEM and FT-IR. The Au-anti IgG-HRP complex has been applied in enhancement of an SPR immunoassay, using a sensor chip constructed from a 1:9 molar ratio of HS-OEG(6)-COOH and HS-OEG(3)-OH, for detection of anti-GAD antibody. As a result, the AuNPs showed an enhancement consistent with previous studies, while the enzyme precipitation using the DAB substrate was applied for the first time and greatly amplified the SPR detection. The limit of detection was found to be as low as 0.03 ng/ml of anti-GAD antibody (or 200 fM), which is much lower than that of previous reports. This study indicates another way to enhance SPR measurement, and it is generally applicable to other SPR-based immunoassays.
Abstract:
Biodegradable amphiphilic diblock copolymers based on an aliphatic ester block and various hydrophilic methacrylic monomers were synthesized using a novel hydroxyl-functionalized trithiocarbonate-based chain transfer agent. One protocol involved the one-pot simultaneous ring-opening polymerization (ROP) of the biodegradable monomer (3S)-cis-3,6-dimethyl-1,4-dioxane-2,5-dione (L-lactide, LA) and reversible addition–fragmentation chain transfer (RAFT) polymerization of 2-(dimethylamino)ethyl methacrylate (DMA) or oligo(ethylene glycol) methacrylate (OEGMA) monomer, with 4-dimethylaminopyridine being used as the ROP catalyst and 2,2′-azobis(isobutyronitrile) as the initiator for the RAFT polymerization. Alternatively, a two-step protocol involving the initial polymerization of LA followed by the polymerization of DMA, glycerol monomethacrylate or 2-(methacryloyloxy)ethyl phosphorylcholine using 4,4′-azobis(4-cyanovaleric acid) as a RAFT initiator was also explored. Using a solvent switch processing step, these amphiphilic diblock copolymers self-assemble in dilute aqueous solution. Their self-assembly provides various copolymer morphologies depending on the block compositions, as judged by transmission electron microscopy and dynamic light scattering. Two novel disulfide-functionalized PLA-branched block copolymers were also synthesized using simultaneous ROP of LA and RAFT copolymerization of OEGMA or DMA with a disulfide-based dimethacrylate. The disulfide bonds were reductively cleaved using tributyl phosphine to generate reactive thiol groups. Thiol–ene chemistry was utilized for further derivatization with thiol-based biologically important molecules and heavy metals for tissue engineering or bioimaging applications, respectively.
Abstract:
Geosynthetic reinforcements are often used to enhance the stability of geotechnical structures such as embankments. These geosynthetic polymers often show significant creep deformation behaviour. In the short-term performance of a geotechnical structure, creep may not play a significant role; however, when dealing with the long-term behaviour, it is necessary to investigate its effect. In this paper, two plane-strain fully coupled finite element analyses have been conducted, one with and the other without taking into account the creep behaviour of the geosynthetics. A well-documented field case, the Leneghans embankment (a geogrid-improved wide embankment constructed near Sydney, Australia in the 1990s), has been used for this purpose. It is evident from the analyses that although the geosynthetic reinforcements may play a vital role in the performance and stability of an embankment in the early days (during and after construction), their contribution may become insignificant with time, and the creep of the geosynthetics may not play a significant role in the long-term stability. © 2012 American Society of Civil Engineers.
Abstract:
A ditopic ligand (1), containing two tridentate bis(acylhydrazone) subunits and bearing both long alkyl chains and hydrogen-bonding groups, has been synthesised. Metal cation binding in the presence of a base leads to hierarchical self-assembly, forming first a neutral [2 x 2] grid-type complex (2) that hierarchically assembles into metallosupramolecular polymer gels in toluene.