951 results for WIDE-RANGE CURRENT MEASUREMENT


Relevance:

100.00%

Publisher:

Abstract:

Renewable energy project development is highly complex and success is by no means guaranteed. Decisions are often made with approximate or uncertain information, yet the methods currently employed by decision-makers do not necessarily accommodate this. Levelised energy cost (LEC) is one commonly applied measure used within the energy industry to assess the viability of potential projects and inform policy. This research proposes a method for accommodating such uncertainty by enhancing the traditional discounted LEC measure with fuzzy set theory. Furthermore, it develops the fuzzy LEC (F-LEC) methodology to incorporate the cost of financing a project from debt and equity sources. Applied to an example bioenergy project, the research demonstrates the benefit of incorporating fuzziness into decision-making on project viability, optimal capital structure and key-variable sensitivity analysis. The proposed method contributes by incorporating uncertain and approximate information into the widely used LEC measure and by being applicable to a wide range of energy project viability decisions. © 2013 Elsevier Ltd. All rights reserved.
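The abstract does not give the F-LEC formulation, so the following is only a minimal sketch of the idea: represent uncertain annual costs as triangular fuzzy numbers (low, mode, high) and push each case through the standard discounted LEC ratio. All function names and project figures below are made up for illustration.

```python
# Illustrative fuzzy levelised energy cost (F-LEC) sketch, assuming
# triangular fuzzy numbers (low, mode, high) for annual costs; the
# paper's actual formulation is not reproduced here.

def discount(values, rate):
    """Present value of a stream of annual values (year 1, 2, ...)."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values, start=1))

def fuzzy_lec(costs_low, costs_mode, costs_high, energy, rate):
    """Return a triangular fuzzy LEC (cost per unit energy).

    costs_*: annual cost streams for pessimistic/central/optimistic cases
    energy:  annual energy output (assumed crisp for simplicity)
    """
    pv_energy = discount(energy, rate)
    return (discount(costs_low, rate) / pv_energy,
            discount(costs_mode, rate) / pv_energy,
            discount(costs_high, rate) / pv_energy)

# Hypothetical 3-year bioenergy project: capital in year 1, O&M after
low, mode, high = [90, 9, 9], [100, 10, 10], [120, 14, 14]
energy = [50, 50, 50]
lec = fuzzy_lec(low, mode, high, energy, rate=0.08)
```

The result is itself a triangular fuzzy number, so viability can be judged against a tariff across the whole plausible cost range rather than at a single point estimate.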


A view has emerged within manufacturing and service organizations that the operations management function can hold the key to achieving competitive edge. This has recently been emphasized by the demands for greater variety and higher quality, which must be set against a background of increasing cost of resources. As nations' trade barriers are progressively lowered and removed, producers of goods and service products are becoming more exposed to competition that may come from virtually anywhere in the world. Simply to survive in this climate, many organizations have found it necessary to improve their manufacturing or service delivery systems. To become real ''winners'', some have adopted a strategic approach to operations and completely reviewed and restructured their approach to production system design and operations planning and control. The articles in this issue of the International Journal of Operations & Production Management have been selected to illustrate current thinking and practice in relation to this situation. They are all based on papers presented to the Sixth International Conference of the Operations Management Association-UK, which was held at Aston University in June 1991. The theme of the conference was "Achieving Competitive Edge" and authors from 15 countries around the world contributed more than 80 papers. Within this special issue five topic areas are addressed, with two articles relating to each. The topics are: strategic management of operations; managing change; production system design; production control; and service operations. Under strategic management of operations, De Toni, Filippini and Forza propose a conceptual model which considers the performance of an operating system as a source of competitive advantage through the ''operation value chain'' of design, purchasing, production and distribution. Their model is set within the context of the tendency towards globalization.
New's article stands somewhat in contrast to the more fashionable literature on operations strategy. It challenges the validity of the current idea of ''world-class manufacturing'' and instead urges a reconsideration of the view that strategic ''trade-offs'' are necessary to achieve a competitive edge. The importance of managing change has for some time been recognized within the field of organization studies, but its relevance in operations management is only now being realized. Berger considers the use of ''organization design'', ''sociotechnical systems'' and change strategies and contrasts these with the more recent idea of the ''dialogue perspective''. A tentative model is suggested to improve the analysis of different strategies in a situation-specific context. Neely and Wilson look at an essential prerequisite if change is to be effected in an efficient way, namely product goal congruence. Using a case study as its basis, their article suggests a method of measuring goal congruence as a means of identifying the extent to which key performance criteria relating to quality, time, cost and flexibility are understood within an organization. The two articles on production system design represent important contributions to the debate on flexible production organization and autonomous group working. Rosander uses the results from cases to test the applicability of ''flow groups'' as the optimal way of organizing batch production. Schuring also examines cases to determine the reasons behind the adoption of ''autonomous work groups'' in The Netherlands and Sweden. Both these contributions help to provide a greater understanding of the production philosophies which have emerged as alternatives to more conventional systems for intermittent and continuous production. The production control articles are both concerned with the concepts of ''push'' and ''pull'', the two broad approaches to material planning and control.
Hirakawa, Hoshino and Katayama have developed a hybrid model, suitable for multistage manufacturing processes, which combines the benefits of both systems. They discuss the theoretical arguments in support of the system and illustrate its performance with numerical studies. Slack and Correa's concern is with the flexibility characteristics of push and pull material planning and control systems. They use the case of two plants operating the different systems to compare their performance across a number of predefined flexibility types. The two final contributions, on service operations, are complementary. The article by Voss in fact relates to manufacturing but examines the application of service industry concepts within the UK manufacturing sector. His studies in a number of companies support the idea of the ''service factory'' and offer a new perspective for manufacturing. Harvey's contribution, by contrast, is concerned with the application of operations management principles to the delivery of professional services. Using the case of social-service provision in Canada, it demonstrates how concepts such as ''just-in-time'' can be used to improve service performance. The ten articles in this special issue of the journal address a wide range of issues and situations. Their common aspect is that, together, they demonstrate the extent to which competitiveness can be improved via the application of operations management concepts and techniques.


Recent research in literacy acquisition has generated detailed programs for teaching phonological awareness. The current paper addresses three issues that follow from this research. Firstly, much of the past research has been conducted under conditions that are divorced from the classroom. As a result, it is not known whether the suggested teaching strategies will lead to an increase in children’s attainments when integrated into a broad reading curriculum implemented by teachers in mainstream classrooms. Secondly, these phonological interventions have been designed either to prevent the occurrence of reading difficulties or to meet the needs of failing readers. Therefore, it is not known whether the same methods would benefit all children. Thirdly, teaching children to read takes a minimum of two to three academic years. We therefore need to develop a reading curriculum that can provide the progression and differentiation to meet a wide range of needs over several academic years. We report two studies that have addressed these issues by monitoring the impact of a reading curriculum, implemented by teachers, which integrated children’s acquisition of phonological skills with broader aspects of teaching reading over three academic years. The attainments of children at all levels of ability in the experimental group were raised relative to controls and, importantly, these gains were maintained after the intervention was withdrawn. These results demonstrate that phonological awareness training can be successfully integrated into real classroom contexts and that the same methods raised the attainments of normally developing children as well as those at risk of reading failure.


Computational Fluid Dynamics (CFD) has found great acceptance among the engineering community as a tool for research and design of processes that are practically difficult or expensive to study experimentally. One of these processes is biomass gasification in a Circulating Fluidized Bed (CFB). Biomass gasification is the thermo-chemical conversion of biomass at high temperature and a controlled oxygen amount into fuel gas, also sometimes referred to as syngas. A circulating fluidized bed is a type of reactor in which it is possible to maintain a stable and continuous circulation of solids in a gas-solid system. The main objectives of this thesis are fourfold: (i) to develop a three-dimensional predictive model of biomass gasification in a CFB riser using advanced CFD techniques; (ii) to experimentally validate the developed hydrodynamic model using conventional and advanced measuring techniques; (iii) to study the complex hydrodynamics, heat transfer and reaction kinetics through modelling and simulation; and (iv) to study the CFB gasifier performance through parametric analysis and identify the optimum operating conditions to maximize product gas quality. Two different and complementary experimental techniques were used to validate the hydrodynamic model, namely pressure measurement and particle tracking. Pressure measurement is a very common and widely used technique in fluidized bed studies, while particle tracking using PEPT, originally developed for medical imaging, is a relatively new technique in the engineering field; it is relatively expensive and available at only a few research centres around the world. This study started with a simple poly-dispersed single solid phase and then moved to binary solid phases. The single solid phase was used for primary validations and for eliminating unnecessary options and steps in building the hydrodynamic model.
The outcomes from the primary validations were then applied to the secondary validations of the binary mixture to avoid time-consuming computations. Studies on binary solid mixture hydrodynamics are rarely reported in the literature. In this study the binary solid mixture was modelled and validated using experimental data from both techniques mentioned above, and good agreement was achieved with both. Following the general gasification steps, the developed model was separated into three main gasification stages: drying; devolatilization and tar cracking; and partial combustion and gasification. Drying was modelled as a mass transfer from the solid phase to the gas phase. The devolatilization and tar cracking model consists of two steps: devolatilization of the biomass, treated as a single reaction generating the biomass gases from the volatile materials, and tar cracking, also modelled as a single reaction generating gases with fixed mass fractions. The first reaction was classified as heterogeneous and the second as homogeneous. The partial combustion and gasification model consisted of carbon combustion reactions and carbon and gas phase reactions. Partial combustion was considered for C, CO, H2 and CH4. The carbon gasification reactions used in this study are the Boudouard reaction with CO2, the reaction with H2O, and the methanation reaction to generate methane. The other gas phase reactions considered are the water gas shift reaction, which is modelled as a reversible reaction, and the methane steam reforming reaction. The developed gasification model was validated using different experimental data from the literature over a wide range of operating conditions. Good agreement was observed, confirming the capability of the model to predict biomass gasification in a CFB with great accuracy.
The developed model has been successfully used to carry out sensitivity and parametric analyses. The sensitivity analysis included the effect of including various combustion reactions and the effect of radiation on the gasification reactions. The model was also used for parametric analysis by varying the following gasifier operating conditions: fuel/air ratio; biomass flow rate; sand (heat carrier) temperature; sand flow rate; sand and biomass particle sizes; gasifying agent (pure air or pure steam); pyrolysis model used; and steam/biomass ratio. Finally, based on these parametric and sensitivity analyses, a final model was recommended for the simulation of biomass gasification in a CFB riser.
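The reaction scheme described above can be summarised compactly in code. The sketch below lists the gasification-stage reactions named in the abstract and uses a standard Arrhenius rate helper to show why temperature is such a strong lever in the parametric analysis; the pre-exponential factor and activation energy are placeholders, not the thesis's fitted kinetic parameters.

```python
import math

# Reaction set named in the gasification model (water gas shift is the
# one modelled as reversible). Kinetic parameters below are placeholders.
R = 8.314  # universal gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    """Rate constant k = A * exp(-Ea / (R T))."""
    return A * math.exp(-Ea / (R * T))

reactions = {
    "boudouard":       "C + CO2 -> 2 CO",
    "steam_gasif":     "C + H2O -> CO + H2",
    "methanation":     "C + 2 H2 -> CH4",
    "water_gas_shift": "CO + H2O <-> CO2 + H2",   # reversible
    "steam_reforming": "CH4 + H2O -> CO + 3 H2",
}

# Rate constants rise steeply with temperature (illustrative A and Ea)
k_low  = arrhenius(A=1.0e7, Ea=1.5e5, T=1000.0)  # K
k_high = arrhenius(A=1.0e7, Ea=1.5e5, T=1200.0)
```

A 200 K increase raises this illustrative rate constant by more than an order of magnitude, which is why sand (heat carrier) temperature features so prominently among the parametric variables.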


Allergic eye disease encompasses a group of hypersensitivity disorders which primarily affect the conjunctiva, and its prevalence is increasing. It is estimated to affect 8% of patients attending optometric practice but is poorly managed and rarely involves ophthalmic assessment. Seasonal allergic conjunctivitis (SAC) is the most common form of allergic eye disease (90%), followed by perennial allergic conjunctivitis (PAC; 5%). Both are type 1 IgE-mediated hypersensitivity reactions in which mast cells play an important role in the pathophysiology. The signs and symptoms are similar, but SAC occurs periodically whereas PAC occurs year round. Despite being relatively mild conditions, their effects on quality of life can be profound and they therefore demand attention. Primary management of SAC and PAC involves avoidance strategies, depending on the responsible allergen(s), to prevent the hypersensitivity reaction. Cooled tear supplements and cold compresses may help bring relief. Pharmacological agents may become necessary as it is not possible to completely avoid the allergen(s). A wide range of anti-allergic medications is available, such as mast cell stabilisers, antihistamines and dual-action agents. Severe cases refractory to conventional treatment require anti-inflammatories, immunomodulators or immunotherapy. Additional qualifications are required to gain access to these medications, but entry-level optometrists must offer advice and supportive therapy. Based on current evidence, the efficacy of anti-allergic medications appears equivocal, so prescribing should be guided by patient preference, dosing and cost. More studies with standardised methodologies are necessary to elicit the most effective anti-allergic medications, but those with dual actions are likely to be first-line agents. © 2011 British Contact Lens Association.


Data envelopment analysis (DEA) has gained a wide range of applications in measuring the comparative efficiency of decision making units (DMUs) with multiple incommensurate inputs and outputs. The standard DEA method requires that the status of every variable as an input or an output be known exactly. In many real applications, however, the status of some measures is not clearly known; these are referred to as flexible measures. This paper proposes a flexible slacks-based measure (FSBM) of efficiency in which each flexible measure can play an input role for some DMUs and an output role for others, so as to maximize the relative efficiency of the DMU under evaluation. Further, we show that when an operational unit is efficient in a specific flexible measure, that measure can play both input and output roles for the unit; in this case, the optimal input/output designation for the flexible measure is the one that optimizes the efficiency of the artificial average unit. An application to assessing UK higher education institutions is used to show the applicability of the proposed approach. © 2013 Elsevier Ltd. All rights reserved.
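The core idea of a flexible measure can be sketched without the full slacks-based linear program (which needs an LP solver): for each DMU, try the flexible measure in both roles and keep whichever designation maximizes that unit's efficiency. The unit-weight ratio below is a deliberate simplification of the FSBM model, and the DMU data are hypothetical.

```python
# Toy illustration of the flexible-measure designation choice. The real
# FSBM model is a slacks-based LP; this fixed-weight efficiency ratio is
# only a sketch of the "choose the role that maximizes efficiency" idea.

def efficiency(inputs, outputs):
    """Naive ratio efficiency with unit weights: sum(outputs)/sum(inputs)."""
    return sum(outputs) / sum(inputs)

def best_designation(inputs, outputs, flexible):
    """Try the flexible measure as an input and as an output; keep the better."""
    as_input  = efficiency(inputs + [flexible], outputs)
    as_output = efficiency(inputs, outputs + [flexible])
    if as_input > as_output:
        return "input", as_input
    return "output", as_output

# Hypothetical higher-education DMU: staff and budget as inputs, graduates
# as an output, research income as the flexible measure
role, score = best_designation(inputs=[10, 5], outputs=[12], flexible=4)
```

For this unit the flexible measure scores better as an output; another DMU with different data could rationally classify the same measure as an input, which is exactly the situation the FSBM approach formalises.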


Onion (Allium cepa L.) belongs botanically to the Liliaceae, and species are found across a wide range of latitudes and altitudes in Europe, Asia, N. America and Africa. World onion production has increased by at least 25% over the past 10 years, with current production around 44 million tonnes, making it the second most important horticultural crop after tomatoes. Because of their storage characteristics and durability for shipping, onions have always been traded more widely than most vegetables. Onions are versatile, are often used as an ingredient in many dishes and are accepted by almost all traditions and cultures. Onion consumption is increasing significantly, particularly in the USA, partly because of heavy promotion that links flavour and health. Onions are rich in two chemical groups that have perceived benefits to human health: the flavonoids and the alk(en)yl cysteine sulphoxides (ACSOs). Two flavonoid subgroups are found in onion: the anthocyanins, which impart a red/purple colour to some varieties, and flavonols such as quercetin and its derivatives, responsible for the yellow and brown skins of many other varieties. The ACSOs are the flavour precursors which, when cleaved by the enzyme alliinase, generate the characteristic odour and taste of onion. The downstream products are a complex mixture of compounds including thiosulphinates, thiosulphonates and mono-, di- and tri-sulphides. Compounds from onion have been reported to have a range of health benefits, including anticarcinogenic properties, antiplatelet activity, antithrombotic activity, and antiasthmatic and antibiotic effects. Here we review the agronomy of the onion crop and the biochemistry of the health compounds, and report on recent clinical data obtained using extracts from this species. Where appropriate we compare the data with those obtained from garlic (Allium sativum L.), for which more information is widely available.
Copyright © 2002 John Wiley & Sons, Ltd.


The total structure factor of molten TbCl3 at 617 °C was measured using neutron diffraction. The data are in agreement with results from previous experimental work, but the use of a diffractometer having an extended reciprocal-space measurement window leads to improved resolution in real space. Significant discrepancies with the results of recent molecular dynamics simulations carried out using a polarizable ion model, in which the interaction potentials were optimized to enhance agreement with previous diffraction data, are thereby highlighted. It is hence shown that there is considerable scope for the development of this model for TbCl3 and for other trivalent metal halide systems spanning a wide range of ion size ratios.
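The link between an extended reciprocal-space window and real-space resolution follows from Fourier truncation: transforming a structure factor measured only up to a maximum momentum transfer Q_max broadens real-space features to a width of roughly pi/Q_max. The sketch below just evaluates this rule of thumb; the Q_max values are illustrative, not those of the instruments involved.

```python
import math

# Real-space resolution limit of a Fourier-transformed structure factor:
# truncation at Q_max broadens r-space features to ~ pi / Q_max.

def real_space_resolution(q_max):
    """Approximate r-space resolution (length units are 1/q_max units)."""
    return math.pi / q_max

# Illustrative windows, in inverse Angstroms
old_window = real_space_resolution(q_max=16.0)
new_window = real_space_resolution(q_max=40.0)
```

Extending the measurement window therefore sharpens the pair distribution function directly, which is what exposes the discrepancies with the simulated structure.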


When designing a practical swarm robotics system, self-organized task allocation is key to making best use of resources. Current research in this area focuses on task allocation which is either distributed (tasks must be performed at different locations) or sequential (tasks are complex and must be split into simpler sub-tasks and processed in order). In practice, however, swarms will need to deal with tasks which are both distributed and sequential. In this paper, a classic foraging problem is extended to incorporate both distributed and sequential tasks. The problem is analysed theoretically, absolute limits on performance are derived, and a set of conditions for a successful algorithm is established. It is shown empirically that an algorithm which meets these conditions, by causing emergent cooperation between robots, can achieve consistently high performance under a wide range of settings without the need for communication. © 2013 IEEE.
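The paper's specific algorithm is not described in the abstract, but communication-free task allocation of this kind is commonly illustrated with a response-threshold model: each robot's probability of taking up a task grows with the task's local stimulus relative to the robot's internal threshold, so a backlog of unfinished sub-tasks recruits more robots with no explicit messaging. The sketch below is that generic model, not the authors' method.

```python
# Generic response-threshold model for self-organised task allocation
# (an illustration, not the algorithm of the paper): the probability a
# robot engages with a task rises with the stimulus s against threshold theta.

def engage_probability(stimulus, theta):
    """P(engage) = s^2 / (s^2 + theta^2), in [0, 1)."""
    return stimulus ** 2 / (stimulus ** 2 + theta ** 2)

# A growing backlog of sequential sub-tasks raises the stimulus and so
# recruits idle robots without any communication between them.
p_quiet   = engage_probability(stimulus=1.0, theta=5.0)
p_backlog = engage_probability(stimulus=10.0, theta=5.0)
```

Heterogeneous thresholds across the swarm then spread robots over distributed task sites in proportion to demand, which is the emergent cooperation the abstract refers to.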


Various room temperature ionic liquids (RTILs), notably 1-methoxyethyl-3-methylimidazolium trifluoroacetate [MeOEtMIM]+[CF3COO]−, have been used to promote the Knoevenagel condensation to afford substituted olefins. All reactions proceeded effectively in the absence of any other catalysts or co-solvents, with good to excellent yields. The method is simple and applicable to reactions of a wide range of aldehydes and ketones with methylene compounds. The ionic liquid can be recycled without noticeable reduction of its catalytic activity. A plausible reaction mechanism is proposed.


Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [ 1 and 2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [ 3] followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. 
A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field. The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P.
pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification analysis has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell-lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. 
Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes even in very large proteins and even ribosomes to be investigated. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins. Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snap-shots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. 
They represent long polypeptide chains in which individual smaller proteins with different biological function are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: The requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. 
We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.


With an ageing population and the increasing prevalence of central nervous system (CNS) disorders, new approaches are required to sustain the development and successful delivery of therapeutics into the brain and CNS. CNS drug delivery is challenging due to the impermeable nature of the brain microvascular endothelial cells that form the blood-brain barrier (BBB) and prevent the entry of a wide range of therapeutics into the brain. This review examines the role intranasal delivery may play in achieving direct brain delivery of small molecular weight drugs, macromolecular therapeutics and cell-based therapeutics by exploiting the olfactory and trigeminal nerve pathways, an approach thought to deliver drugs into the brain and CNS by bypassing the BBB. Details of the mechanism of transfer of administered therapeutics and the pathways that lead to brain deposition, with a specific focus on therapeutic pharmacokinetics, and examples of successful CNS delivery are explored. © 2014 Bentham Science Publishers.


We use the GN-model to assess Nyquist-WDM 100/200 Gbit/s PM-QPSK/16QAM signal reach on low-loss, large-core-area fibre using extended-range, variable-gain hybrid Raman-EDFAs. 5000/1500 km transmission is possible over a wide range of amplifier spans. © OSA 2014.
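In GN-model reach estimates of this kind, nonlinear interference is treated as additive Gaussian noise growing with the cube of launch power, so the per-channel SNR is P/(P_ase + eta*P^3) and the optimum launch power falls out analytically. The sketch below shows that standard calculation; the noise and nonlinearity values are illustrative placeholders, not the link parameters of this paper.

```python
# GN-model SNR sketch: ASE noise power p_ase plus nonlinear interference
# eta * P^3 (all in linear units). Values below are illustrative only.

def gn_snr(p, p_ase, eta):
    """Per-channel SNR for launch power p."""
    return p / (p_ase + eta * p ** 3)

def optimum_power(p_ase, eta):
    """d(SNR)/dP = 0  =>  P_opt = (p_ase / (2 * eta)) ** (1/3)."""
    return (p_ase / (2 * eta)) ** (1.0 / 3.0)

p_ase, eta = 1e-6, 1e-3   # placeholder linear-unit values
p_opt = optimum_power(p_ase, eta)
```

At the optimum the NLI contributes exactly half the ASE power, and the achievable SNR at that point translates directly into the transmission reach quoted for each span length.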


Objective: The aim of this study was to provide an initial insight into current UK paediatric prescribing practice. Methods: In 2012, focus groups were conducted at Birmingham Children's Hospital (a UK specialist hospital) with both medical and non-medical prescribers and analysed using thematic analysis. Key findings: Both sets of prescribers used a wide range of resources to support their prescribing decisions. Dosing information was most commonly checked, and a lack of specialist paediatric information was reported in existing resources. All groups had high expectations of the support functions that should be included in an electronic prescribing system and could see many potential benefits. Participants agreed that all staff should see the same drug alerts. The overwhelming concern was whether the current information technology infrastructure would support electronic prescribing. Conclusions: Prescribers had high expectations of electronic prescribing but lacked confidence in its delivery. Prescribers use a wide range of resources to support their decision making when prescribing in paediatrics.


Detonation nanodiamond (DND) is an attractive class of diamond material with great potential for a wide range of applications. In this paper, untreated DND was hydrogen-passivated using microwave plasma enhanced chemical vapor deposition in order to investigate the influence of a hydrogen-terminated surface on the DND's electrical properties. Impedance spectroscopy (IS) was used to characterize the electrical properties of the DND samples using a newly developed measurement set-up. It is found that the hydrogen passivation process increased the electrical conductivity of the DND by up to four orders of magnitude compared with the untreated sample. An RC parallel equivalent circuit with a Warburg element is proposed to model the DND's impedance characteristics. © 2012 Elsevier B.V. All rights reserved.
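The proposed equivalent circuit can be evaluated directly with complex arithmetic: a resistor and capacitor in parallel, in series with a Warburg diffusion element whose impedance is commonly written as sigma*(1 - j)/sqrt(omega). The component values below are illustrative, not fitted DND parameters, and the exact circuit topology assumed here is a sketch of the one named in the abstract.

```python
import math

# Sketch of an RC-parallel equivalent circuit with a series Warburg
# element. Component values are placeholders, not fitted DND parameters.

def impedance(omega, r, c, sigma):
    """Complex impedance at angular frequency omega (rad/s)."""
    z_rc = 1.0 / (1.0 / r + 1j * omega * c)       # R and C in parallel
    z_w  = sigma * (1 - 1j) / math.sqrt(omega)    # Warburg (diffusion) term
    return z_rc + z_w

z_low_f  = impedance(omega=1e2, r=1e6, c=1e-9, sigma=1e3)
z_high_f = impedance(omega=1e6, r=1e6, c=1e-9, sigma=1e3)
```

Sweeping omega over the measured range and fitting r, c and sigma to the IS spectrum is how such a model would be matched to the hydrogen-passivated and untreated samples; the conductivity increase reported above would appear as a large drop in the fitted r.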