540 results for Bespoke textiles
Abstract:
In this paper, we use plant-level data from two Indian industries, electrical machinery and textiles, to examine the empirical relationship between structural reforms such as the removal of entry restrictions to the product market, competition, and firm-level productivity and efficiency. These industries have faced different sets of policies since Independence, but both were restricted in the adoption of technology and in the development of optimal scales of production. They also belonged to the first set of industries that benefited from the liberalization process started in the 1980s. Our results suggest that both industries had improved their efficiency and scales of operation by the turn of the century. However, the process of adjustment appears to have run its course more fully for electrical machinery. We also find evidence of spatial fragmentation of the market as late as 2000–2001. Gains in labour productivity were much more evident in states that either have a strong history of industrial activity or have experienced significant improvements in the business environment since 1991.
Abstract:
We explore the causal links between service firms' knowledge investments, their innovation outputs and business growth based on a bespoke survey of around 1100 UK service businesses. We combine the activity-based approach of the innovation value chain with firms' external links at each stage of the innovation process. This introduces the concept of 'encoding' relationships through which learning improves the effectiveness of firms' innovation processes. Our econometric results emphasise the importance of external openness in the initial, exploratory phase of the innovation process and the significance of internal openness (e.g. team working) in later stages of the process. In-house design capacity is strongly linked to a firm's ability to absorb external knowledge for innovation. Links to customers are important in the exploratory stage of the innovation process, but encoding linkages with private and public research organisations are more important in developing innovation outputs. Business growth is related directly both to the extent of firms' service innovation and to the diversity of that innovation, reflecting marketing, strategic and business process change.
Abstract:
Golfers, coaches and researchers alike have keyed in on golf putting as an important aspect of overall golf performance. Of the three principal putting tasks (green reading, alignment and the putting action phase), the putting action phase has attracted the most attention from coaches, players and researchers. This phase includes the alignment of the club with the ball, the swing, and ball contact. A significant amount of research in this area has focused on measuring golfers' vision strategies with eye tracking equipment. Unfortunately, this research suffers from a number of shortcomings, which limit its usefulness. The purpose of this thesis was to address some of these shortcomings. The primary objective of this thesis was to re-evaluate golfers' putting vision strategies using binocular eye tracking equipment and to define a new, optimal putting vision strategy associated with both higher skill and success. In order to facilitate this research, bespoke computer software was developed and validated, and new gaze behaviour criteria were defined. Additionally, the effects of training (habitual) and competition conditions on the putting vision strategy were examined, as was the effect of ocular dominance. Finally, methods for improving golfers' binocular vision strategies are discussed, and a clinical plan for the optometric management of the golfer's vision is presented. The clinical management plan includes the correction of fundamental aspects of golfers' vision, including monocular refractive errors and binocular vision defects, as well as enhancement of their putting vision strategy, with the overall aim of improving performance on the golf course. This research has been undertaken in order to gain a better understanding of the human visual system and how it relates to the sport performance of golfers specifically. Ultimately, the analysis techniques and methods developed are applicable to the assessment of visual performance in all sports.
Abstract:
Accommodating intraocular lenses (IOLs), multifocal IOLs (MIOLs) and toric IOLs are designed to provide a greater level of spectacle independence post cataract surgery. All of these IOLs rely on accurate calculation of intraocular lens power, which in turn depends on reliable ocular biometry. A standardised defocus area metric and a reading performance index metric were devised for evaluating the range of focus and the reading ability of subjects implanted with presbyopia-correcting IOLs. The range of clear vision after implantation of an MIOL is extended by a second focal point; however, this results in a higher prevalence of dysphotopsia. A bespoke halometer was designed and validated to assess this photopic phenomenon. There is a lack of standardisation in the methods used for determining IOL orientation and thus rotation. A repeatable, objective method was therefore developed to allow accurate assessment of IOL rotation, and was used to determine the rotational and positional stability of a closed-loop haptic IOL. A new commercially available biometry device was validated for use with subjects prior to cataract surgery. The optical low coherence reflectometry instrument proved to be a valid method for assessing ocular biometry and covered a wider range of ocular parameters than previous instruments. The advantages of MIOLs were shown to include an extended range of clear vision, translating into greater reading ability. However, an increased prevalence of dysphotopsia was shown with the bespoke halometer, and this depended on the MIOL optic design. Implantation of a single-optic accommodating IOL did not improve reading ability but achieved high subjective ratings of near vision. The closed-loop haptic IOL displayed excellent rotational stability in the late post-implantation period but relatively poor rotational stability in the early period. The orientation error was compounded by the high frequency of positional misalignment, leading to an extensive overall misalignment of the IOL. This thesis demonstrates the functionality of new IOL designs and the importance of standardised testing methods, thus providing a greater understanding of the consequences of implanting these IOLs. Consequently, the findings of the thesis will influence future designs of IOLs and testing methods.
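The specific defocus area metric devised in the thesis is not detailed above; the sketch below only illustrates one common way such a metric can be computed, as the area between a measured defocus curve and a criterion acuity level. The data, criterion and units are illustrative placeholders, not the standardised metric itself.

```python
# Illustrative only: a defocus-area style metric from a defocus curve
# (visual acuity in logMAR measured across levels of lens-induced defocus in
# dioptres). Values and the 0.3 logMAR criterion are assumed placeholders.
import numpy as np

defocus = np.arange(-4.0, 1.5, 0.5)                  # dioptres of induced defocus
acuity = np.array([0.60, 0.45, 0.32, 0.25, 0.20,     # logMAR at each defocus level
                   0.15, 0.10, 0.05, 0.02, 0.10, 0.30])
criterion = 0.30                                     # e.g. 0.3 logMAR (6/12) cut-off

# Area between the criterion line and the curve, counted only where acuity is
# better (numerically lower) than the criterion; a larger area indicates a
# wider range of useful focus.
gain = np.clip(criterion - acuity, 0.0, None)
defocus_area = np.trapz(gain, defocus)
print(f"defocus area ≈ {defocus_area:.2f} logMAR·D")
```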
Abstract:
Purpose. To examine the influence of positional misalignments on intraocular pressure (IOP) measurement with a rebound tonometer. Methods. Using the iCare rebound tonometer, IOP readings were taken from the right eye of 36 healthy subjects at the central corneal apex (CC) and compared to IOP measures using the Goldmann applanation tonometer (GAT). Using a bespoke rig, iCare IOP readings were also taken 2 mm laterally from CC, both nasally and temporally, along with angular deviations of 5 and 10 degrees, both nasally and temporally to the visual axis. Results. Mean IOP ± SD, as measured by GAT, was 14.7±2.5 mmHg versus iCare tonometer readings of 17.4±3.6 mmHg at CC, representing an iCare IOP overestimation of 2.7±2.8 mmHg (P<0.001), which increased at higher average IOPs. IOP at CC using the iCare tonometer was not significantly different from values at lateral displacements. IOP was marginally underestimated with angular deviation of the probe, but this reached significance only at 10 degrees nasally. Conclusions. As shown previously, the iCare tonometer overestimates IOP compared to GAT. However, IOP measurement in normal, healthy subjects using the iCare rebound tonometer appears insensitive to misalignments. An IOP underestimation of <1 mmHg with the probe deviated 10 degrees nasally reached statistical but not clinical significance. © 2013 Ian G. Beasley et al.
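As a minimal illustration of the comparison reported above (not the authors' analysis code or data), the sketch below computes the mean paired difference between two tonometers, tests it with a paired t-test, and checks whether the overestimation grows with the average IOP. The simulated readings are placeholders.

```python
# Minimal sketch with simulated placeholder readings: paired GAT vs iCare comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
gat = rng.normal(14.7, 2.5, 36)          # GAT IOP, mmHg (placeholder)
icare = gat + rng.normal(2.7, 2.8, 36)   # iCare IOP, mmHg (placeholder)

diff = icare - gat
t, p = stats.ttest_rel(icare, gat)       # is the mean paired difference non-zero?
print(f"mean difference = {diff.mean():.1f} ± {diff.std(ddof=1):.1f} mmHg, p = {p:.4f}")

# Does the overestimation increase at higher average IOPs?
slope, intercept, r, p_slope, se = stats.linregress((icare + gat) / 2, diff)
print(f"slope = {slope:.2f} mmHg per mmHg of average IOP, p = {p_slope:.3f}")
```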
An agent approach to improving radio frequency identification enabled Returnable Transport Equipment
Abstract:
Returnable transport equipment (RTE) such as pallets forms an integral part of the supply chain, and poor management leads to costly losses. Companies often address this matter by outsourcing the management of RTE to logistics service providers (LSPs). LSPs are faced with the task of providing logistical expertise to reduce RTE-related waste, whilst differentiating their own services to remain competitive. In the current challenging economic climate, the role of the LSP in delivering innovative ways to achieve competitive advantage has never been so important. It is reported that applying radio frequency identification (RFID) to RTE enables LSPs such as DHL to gain competitive advantage and offer clients improvements such as loss reduction, process efficiency improvement and effective security. However, the increased visibility and functionality of RFID-enabled RTE requires further investigation with regard to decision-making. The distributed nature of the RTE network favours a decentralised decision-making format. Agents are an effective way to represent objects from the bottom up, capturing their behaviour and enabling localised decision-making. Therefore, an agent-based system is proposed to represent the RTE network and utilise the visibility and data gathered from RFID tags. Two types of agents are developed to represent the trucks and the RTE, each with bespoke rules and algorithms to facilitate negotiations. The aim is to create schedules that integrate RTE pick-ups as the trucks return to the depot. The findings assert that:
- agent-based modelling provides an autonomous tool that is effective in modelling RFID-enabled RTE in a decentralised manner, utilising the real-time data facility;
- the RFID-enabled RTE model developed enables autonomous agent interaction, which leads to a feasible schedule integrating both forward and reverse flows for each RTE batch;
- the RTE agent scheduling algorithm developed promotes the utilisation of RTE by including an automatic return flow for each batch of RTE, whilst considering fleet costs and utilisation rates;
- the research conducted contributes an agent-based platform that LSPs can use to assess the most appropriate strategies to implement for RTE network improvement for each of their clients.
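The thesis's own agent rules and scheduling algorithm are not reproduced above, so the sketch below is only a minimal, contract-net style illustration of the underlying idea: RTE batch agents announce pick-ups and truck agents returning to the depot bid on them, with the cheapest feasible detour winning. All names, costs and capacities are assumptions.

```python
# Illustrative sketch (not the thesis algorithm): decentralised assignment of
# RTE pick-ups to returning trucks via simple bid/award negotiation.
from dataclasses import dataclass, field

@dataclass
class RTEBatch:
    batch_id: str
    location: float          # position along a 1-D route back to the depot
    quantity: int            # number of pallets in the batch

@dataclass
class Truck:
    truck_id: str
    position: float
    capacity: int
    load: int = 0
    schedule: list = field(default_factory=list)

    def bid(self, batch: RTEBatch):
        """Return a detour-cost bid, or None if the batch does not fit."""
        if self.load + batch.quantity > self.capacity:
            return None
        return abs(self.position - batch.location)   # assumed cost: simple detour distance

    def award(self, batch: RTEBatch):
        self.load += batch.quantity
        self.schedule.append(batch.batch_id)

def negotiate(trucks, batches):
    """Each RTE batch announces itself; the cheapest feasible truck wins it."""
    for batch in batches:
        bids = [(t.bid(batch), t) for t in trucks]
        bids = [(cost, t) for cost, t in bids if cost is not None]
        if bids:
            _, winner = min(bids, key=lambda b: b[0])
            winner.award(batch)
    return {t.truck_id: t.schedule for t in trucks}

if __name__ == "__main__":
    trucks = [Truck("T1", position=2.0, capacity=30), Truck("T2", position=8.0, capacity=20)]
    batches = [RTEBatch("B1", 3.0, 15), RTEBatch("B2", 7.5, 18), RTEBatch("B3", 2.5, 20)]
    print(negotiate(trucks, batches))
```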
Abstract:
In this paper, the authors use an exponential generalized autoregressive conditional heteroscedastic (EGARCH) error-correction model (ECM), that is, EGARCH-ECM, to estimate the pass-through effects of foreign exchange (FX) rates and producers' prices for 20 U.K. export sectors. The long-run adjustment of export prices to FX rates and producers' prices is within the range of -1.02% (for the Textiles sector) to -17.22% (for the Meat sector). The contemporaneous pricing-to-market (PTM) coefficient is within the range of -72.84% (for the Fuels sector) to -8.05% (for the Textiles sector). Short-run FX rate pass-through is not complete even after several months. Rolling EGARCH-ECMs show that the short- and long-run effects of FX rates and producers' prices fluctuate substantially, as do the asymmetry and volatility estimates, before equilibrium is achieved.
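The sketch below is a minimal two-step illustration of an error-correction model with EGARCH errors in the spirit of the model described above, using the statsmodels and arch packages on simulated placeholder series. It is not the authors' specification, data or estimation procedure.

```python
# Two-step sketch: (1) long-run relation and ECM by OLS, (2) EGARCH(1,1) on the
# ECM residuals. Series below are simulated placeholders for log export prices,
# log FX rates and log producer prices.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from arch import arch_model

rng = np.random.default_rng(0)
n = 300
fx = np.cumsum(rng.normal(0, 0.01, n))               # log FX rate (placeholder)
ppi = np.cumsum(rng.normal(0, 0.005, n))             # log producer prices (placeholder)
px = 0.6 * fx + 0.4 * ppi + rng.normal(0, 0.02, n)   # log export price (placeholder)

# Step 1: long-run relation (Engle-Granger first stage) and error-correction term
longrun = sm.OLS(px, sm.add_constant(np.column_stack([fx, ppi]))).fit()
ect = longrun.resid

# Step 2: short-run ECM in first differences with the lagged error-correction term
d = pd.DataFrame({"dpx": np.diff(px), "dfx": np.diff(fx),
                  "dppi": np.diff(ppi), "ect": ect[:-1]})
ecm = sm.OLS(d["dpx"], sm.add_constant(d[["dfx", "dppi", "ect"]])).fit()

# Step 3: EGARCH(1,1) on the ECM residuals (scaled to per cent) to capture
# asymmetric volatility in the pricing equation.
egarch = arch_model(100 * ecm.resid, vol="EGARCH", p=1, o=1, q=1, mean="Zero")
print(egarch.fit(disp="off").summary())
```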
Abstract:
This paper builds on previous work (Clark, 2009; Clark & Andrews, 2011, 2014) to continue the debate around a seemingly universal question: "How can educational theory be applied to engineering education in such a way as to make the subject more accessible and attractive to students?" It argues that there are three key elements to student success: Relationships, Variety & Synergy (RVS). By further examining the purposefully developed bespoke learning and teaching approach constructed around these three elements (RVS), the discourse in this paper links educational theory to engineering education and, in doing so, further develops the argument for the introduction of a purposefully designed pedagogic approach for use in engineering education.
Abstract:
The stylized literature on foreign direct investment (FDI) suggests that developing countries should invest in the human capital of their labor force in order to attract FDI. However, if educational quality in a developing country is uncertain, such that formal education is a noisy signal of human capital, it might be rational for multinational enterprises to focus more on job-specific training than on the formal education of the labor force. Using cross-country data from the textiles and garments industry, we demonstrate that training indeed has a greater impact on firm efficiency in developing countries than formal education of the workforce. © 2013 John Wiley & Sons Ltd.
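One simple way to express the paper's central comparison (not the authors' exact econometric model) is to interact both education and training with a developing-country indicator and compare the interaction terms, as in the illustrative sketch below on simulated placeholder data.

```python
# Illustrative specification only: firm efficiency regressed on education and
# training, each interacted with a developing-country dummy. Data are simulated
# placeholders, not the cross-country textiles and garments dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "developing": rng.integers(0, 2, n),
    "education": rng.normal(10, 2, n),   # years of schooling (placeholder)
    "training": rng.normal(0, 1, n),     # training intensity (placeholder)
})
df["efficiency"] = (0.05 * df["education"]
                    + 0.10 * df["training"] * df["developing"]
                    + rng.normal(0, 0.5, n))

# A larger training:developing coefficient than education:developing would be
# consistent with the paper's headline finding.
model = smf.ols("efficiency ~ education * developing + training * developing", data=df).fit()
print(model.summary().tables[1])
```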
Abstract:
JenPep is a relational database containing a compendium of thermodynamic binding data for the interaction of peptides with a range of important immunological molecules: the major histocompatibility complex (MHC), the TAP transporter, and the T cell receptor. The database also includes annotated lists of B cell and T cell epitopes. Version 2.0 of the database is implemented in a bespoke PostgreSQL database system and is fully searchable online via a Perl/HTML interface (URL: http://www.jenner.ac.uk/JenPep).
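The JenPep schema itself is not described above, so the sketch below only illustrates the kind of relational query such a peptide-binding compendium supports. The table layout and column names are hypothetical, and SQLite stands in for PostgreSQL purely so the example runs self-contained.

```python
# Toy illustration with a hypothetical schema (not JenPep's actual tables):
# store peptide-MHC binding records and retrieve high-affinity class I binders.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE binding (
    peptide TEXT, molecule_class TEXT, allele TEXT, ic50_nm REAL)""")
con.executemany("INSERT INTO binding VALUES (?, ?, ?, ?)", [
    ("SIINFEKL",      "MHC-I",  "H-2Kb",    9.0),   # IC50 values are placeholders
    ("GILGFVFTL",     "MHC-I",  "HLA-A*02", 12.0),
    ("PKYVKQNTLKLAT", "MHC-II", "HLA-DR1",  85.0),
])

# Example query: class I binders with IC50 below 50 nM, strongest first.
rows = con.execute("""SELECT peptide, allele, ic50_nm FROM binding
                      WHERE molecule_class = 'MHC-I' AND ic50_nm < 50
                      ORDER BY ic50_nm""").fetchall()
print(rows)
```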
Abstract:
Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1 and 2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification analysis has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell-lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins and ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.
Abstract:
The cardiovascular health of the human population is a major concern for medical clinicians, with cardiovascular diseases responsible for 48% of all deaths worldwide, according to the World Health Organization. The development of new diagnostic tools that are practicable and economical for scrutinizing the cardiovascular health of humans is a major driver for clinicians. We offer a new technique to obtain seismocardiographic signals up to 54 Hz, covering both ballistocardiography (below 20 Hz) and audible heart sounds (20 Hz upward), using a system based on curvature sensors formed from fiber optic long period gratings. This system can visualize the real-time three-dimensional (3-D) mechanical motion of the heart by using the data from the sensing array in conjunction with a bespoke 3-D shape reconstruction algorithm. Visualization is demonstrated by adhering three to four sensors to the outside of the thorax, in close proximity to the apex of the heart; the sensing scheme revealed a complex motion of the heart wall next to the apex region of the heart. The detection scheme is low-cost, portable, easily operated and has the potential for ambulatory applications.
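The bespoke 3-D reconstruction algorithm itself is not given above. As a simplified illustration of the general shape-from-curvature idea behind such sensing arrays, the sketch below reconstructs a planar curve from curvature samples by integrating the tangent angle along arc length; it is a 2-D analogue under assumed sampling, not the authors' method.

```python
# 2-D shape-from-curvature sketch: integrate curvature along arc length to get
# the tangent angle, then integrate the tangent direction to recover positions.
import numpy as np

def reconstruct_2d(curvature, ds, theta0=0.0, origin=(0.0, 0.0)):
    """Reconstruct a planar curve from curvature samples spaced ds apart."""
    theta = theta0 + np.cumsum(curvature) * ds       # tangent angle along the fibre
    x = origin[0] + np.cumsum(np.cos(theta)) * ds    # integrate tangent direction
    y = origin[1] + np.cumsum(np.sin(theta)) * ds
    return np.column_stack([x, y])

# Sanity check: constant curvature 1/R should trace part of a circle of radius R.
R, ds, n = 0.05, 0.001, 100                          # 5 cm radius, 1 mm steps
curve = reconstruct_2d(np.full(n, 1.0 / R), ds)
print(curve[-1])                                     # end point of the reconstructed arc
```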
Abstract:
Purpose: To describe and validate bespoke software designed to extract morphometric data from Visante Anterior Segment Optical Coherence Tomography (AS-OCT) images of the ciliary muscle. Method: Initially, to ensure the software was capable of appropriately applying tiered refractive index corrections and accurately measuring orthogonal and oblique parameters, 5 sets of custom-made rigid gas-permeable lenses aligned to simulate the sclera and ciliary muscle were imaged by the Visante AS-OCT and analysed by the software. Human temporal ciliary muscle data from 50 participants extracted via the internal Visante AS-OCT caliper method and the software were compared. The repeatability of the software was also investigated by imaging the temporal ciliary muscle of 10 participants on 2 occasions. Results: The mean difference between the software measurements and the absolute thickness of the rigid gas-permeable lenses was not statistically significantly different from 0 (t = -1.458, p = 0.151). Good correspondence was observed between human ciliary muscle measurements obtained by the software and the internal Visante AS-OCT calipers (maximum thickness t = -0.864, p = 0.392; total length t = 0.860, p = 0.394). The software extracted highly repeatable ciliary muscle measurements (variability ≤6% of mean value). Conclusion: The bespoke software is capable of extracting accurate and repeatable ciliary muscle measurements and is suitable for analysing large data sets.
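The sketch below illustrates, on simulated placeholder data, the two kinds of checks reported above: a paired comparison of software against caliper measurements and a two-visit repeatability estimate expressed as a coefficient of variation. It is not the authors' analysis code.

```python
# Minimal validation-statistics sketch with simulated placeholder measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
caliper = rng.normal(0.75, 0.05, 50)             # max ciliary muscle thickness, mm (placeholder)
software = caliper + rng.normal(0.0, 0.02, 50)   # bespoke software readings (placeholder)

# Agreement check: paired t-test of software vs. caliper measurements.
t, p = stats.ttest_rel(software, caliper)
print(f"paired t = {t:.3f}, p = {p:.3f}")

# Repeatability check: within-subject SD across two visits as a % of the mean.
visit1 = rng.normal(0.75, 0.05, 10)
visit2 = visit1 + rng.normal(0.0, 0.02, 10)
within_sd = np.std(visit2 - visit1, ddof=1) / np.sqrt(2)
cov = 100 * within_sd / np.mean([visit1, visit2])
print(f"coefficient of variation ≈ {cov:.1f}% of mean value")
```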
Abstract:
The sharing of near real-time traceability knowledge in supply chains plays a central role in coordinating business operations and is a key driver for their success. However, before traceability datasets received from external partners can be integrated with datasets generated internally within an organisation, they need to be validated against information recorded for the physical goods received, as well as against bespoke rules defined to ensure uniformity, consistency and completeness within the supply chain. In this paper, we present a knowledge-driven framework for the runtime validation of critical constraints on incoming traceability datasets encapsulated as EPCIS event-based linked pedigrees. Our constraints are defined using SPARQL queries and SPIN rules. We present a novel validation architecture based on the integration of the Apache Storm framework for real-time, distributed computation with popular Semantic Web/Linked Data libraries, and exemplify our methodology on an abstraction of the pharmaceutical supply chain.
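As a minimal, self-contained illustration of SPARQL-based constraint checking on an EPCIS-style event (not the paper's actual ontology, SPIN rules or Storm topology), the sketch below uses rdflib to run an ASK query that flags events missing required fields; the vocabulary and identifiers are simplified stand-ins.

```python
# Illustrative runtime validation of a traceability event with a SPARQL ASK query.
# The ex: vocabulary below is an assumed stand-in, not the EPCIS ontology, and
# plain rdflib replaces the distributed Storm pipeline purely for illustration.
from rdflib import Graph

event_ttl = """
@prefix ex: <http://example.org/epcis#> .
ex:event1 a ex:ObjectEvent ;
    ex:epc "urn:epc:id:sgtin:0614141.107346.2017" ;
    ex:eventTime "2013-06-08T14:58:56Z" ;
    ex:bizStep "shipping" ;
    ex:readPoint "urn:epc:id:sgln:0614141.00777.0" .
"""

# Constraint: every ObjectEvent must carry an EPC, an event time and a read point.
constraint = """
PREFIX ex: <http://example.org/epcis#>
ASK {
  ?e a ex:ObjectEvent .
  FILTER NOT EXISTS { ?e ex:epc ?epc ; ex:eventTime ?t ; ex:readPoint ?rp . }
}
"""

g = Graph()
g.parse(data=event_ttl, format="turtle")
violates = bool(g.query(constraint).askAnswer)   # True if any event misses a field
print("constraint violated" if violates else "event passes validation")
```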