904 results for encoding of measurement streams


Relevance:

100.00%

Publisher:

Abstract:

Fibre-optic communications systems have traditionally carried data using binary (on-off) encoding of the light amplitude. However, next-generation systems will use both the amplitude and phase of the optical carrier to achieve higher spectral efficiencies and thus higher overall data capacities(1,2). Although this approach requires highly complex transmitters and receivers, the increased capacity and many further practical benefits that accrue from a full knowledge of the amplitude and phase of the optical field(3) more than outweigh this additional hardware complexity and can greatly simplify optical network design. However, use of the complex optical field gives rise to a new dominant limitation to system performance: nonlinear phase noise(4,5). Developing a device to remove this noise is therefore of great technical importance. Here, we report the development of the first practical ('black-box') all-optical regenerator capable of removing both phase and amplitude noise from binary phase-encoded optical communications signals.

Relevance:

100.00%

Publisher:

Abstract:

The present study examines associations between job demands, job control, social work environment (cooperation/communication, social support, social stressors), and strain at work with basal salivary cortisol (day profiles, cortisol awakening reaction, diurnal variation). Forty-six bank employees collected four saliva samples (immediately after waking up, 30 min after waking up, at 2 p.m., and immediately before going to bed) on each of two consecutive working days. We computed the mean across the two days for each of the four saliva samples per employee. Job characteristics were assessed by self-report as well as by objective job analysis, i.e. independently of the job holder. Analyses were controlled for possible confounding effects of age, gender, smoking, body-mass index, health impairments, and non-compliance with the cortisol protocol. Results show that subjectively experienced low social support and high social stressors at work were associated with an elevated cortisol awakening reaction and elevated diurnal variation. We found no effects for job demands, job control, or objectively assessed cooperation/communication. Our results suggest that both the type of job characteristic and the type of measurement of job characteristics have to be taken into account when relating them to employees' cortisol secretion.

Relevance:

100.00%

Publisher:

Abstract:

The Retinal Vessel Analyser (RVA) is a commercially available ophthalmoscopic instrument capable of acquiring vessel diameter fluctuations in real time and at high temporal resolution. Visual stimulation by means of flickering light is a unique tool for exploring neurovascular coupling in the human retina. Vessel reactivity as mediated by local vascular endothelial vasodilators and vasoconstrictors can be assessed non-invasively, in vivo. In brief, the work in this thesis:

• deals with interobserver and intraobserver reproducibility of the flicker responses in healthy volunteers
• explains the superiority of individually analysed reactivity parameters over vendor-generated output
• links static retinal measures with dynamic ones
• highlights practical limitations in the use of the RVA that may undermine its clinical usefulness
• provides recommendations for standardising measurements in terms of vessel location and vessel segment length, and
• presents three case reports of essential hypertensives in a -year follow-up.

Strict standardisation of measurement procedures is a necessity when utilising the RVA system. Agreement between research groups on implemented protocols needs to be reached before the RVA can be considered a clinically useful tool for detecting or predicting microvascular dysfunction.

Relevance:

100.00%

Publisher:

Abstract:

This paper describes the horizontal deflection behaviour of streams of particles in paramagnetic fluids under a high-gradient superconducting magnetic field, continuing earlier work on the exploration of particle magneto-Archimedes levitation. Building on the previous work on the horizontal deflection of a single particle, a glass box and collector were designed to observe the movement of particle groups in paramagnetic fluids. To determine the exact separation efficiency, a "sink-float" method involving the high-density fluid polytungstate (dense medium separation) and MLA (Mineral Liberation Analyser) analysis was performed. It was found that the particles were deflected and settled at certain positions on the container floor due to the combined forces of gravity and magneto-Archimedes forces, as well as a lateral buoyancy (displacement) force. Mineral particles with different densities and susceptibilities could be deflected to different positions, thus producing groups of similar types of particles. The work described here, although in its infancy, could form the basis of a new approach to separating particles based on a combination of susceptibility and density. © 2014 Elsevier B.V.

Relevance:

100.00%

Publisher:

Abstract:

We agree with de Jong et al.'s argument that business historians should make their methods more explicit and welcome a more general debate about the most appropriate methods for business historical research. But rather than advocating one ‘new business history’, we argue that contemporary debates about methodology in business history need greater appreciation for the diversity of approaches that have developed in the last decade. And while the hypothesis-testing framework prevalent in the mainstream social sciences favoured by de Jong et al. should have its place among these methodologies, we identify a number of additional streams of research that can legitimately claim to have contributed novel methodological insights by broadening the range of interpretative and qualitative approaches to business history. Thus, we reject privileging a single method, whatever it may be, and argue instead in favour of recognising the plurality of methods being developed and used by business historians – both within their own field and as a basis for interactions with others.

Relevance:

100.00%

Publisher:

Abstract:

This paper introduces an encoding of knowledge representation statements as regular languages and proposes a two-phase approach to the processing of explicitly declared conceptual information. The idea is presented for simple conceptual graphs, where conceptual pattern search is implemented by the so-called projection operation. Projection calculations are organised into off-line preprocessing and run-time computations. This enables fast run-time treatment of NP-complete problems, given that the intermediate results of the off-line phase are kept in suitable data structures. Experiments with randomly generated, medium-size knowledge bases support the claim that the suggested approach radically improves run-time conceptual pattern search.
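The off-line/run-time split described above can be illustrated with a minimal sketch (a toy in Python; the graph, the index layout and the function names are my own, and consistent variable binding across pattern edges, which is where the NP-completeness lies, is omitted for brevity): the knowledge base is indexed by relation type once, so run-time projection only touches candidate edges.

```python
from collections import defaultdict
from itertools import product

# Toy knowledge base: a simple conceptual graph as relation triples
# (relation, subject_concept, object_concept). All names are illustrative.
kb = [
    ("attr", "Cat", "Colour"),
    ("on", "Cat", "Mat"),
    ("attr", "Mat", "Colour"),
    ("on", "Dog", "Rug"),
]

# Off-line phase: index every triple by its relation type once,
# so run-time pattern search never scans the whole knowledge base.
index = defaultdict(list)
for triple in kb:
    index[triple[0]].append(triple)

def project(pattern):
    """Run-time phase: find KB triples matching a pattern graph.

    A pattern is a list of (relation, subject, object) where '?' is a
    wildcard; returns one candidate combination per matching assignment.
    """
    matches = []
    for rel, s, o in pattern:
        cands = [t for t in index.get(rel, ())
                 if s in ("?", t[1]) and o in ("?", t[2])]
        if not cands:          # one unmatched edge -> no projection at all
            return []
        matches.append(cands)
    return list(product(*matches))
```

With the index precomputed, `project([("on", "Cat", "?")])` inspects only the `on` edges rather than the whole knowledge base.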

Relevance:

100.00%

Publisher:

Abstract:

The paper presents a case study of geo-monitoring a region, consisting in the capturing and encoding of human expertise into a knowledge-based system. As soon as the maps have been processed, data patterns are detected using knowledge-based agents for harvest prognosis.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a novel error-free (infinite-precision) architecture for the fast implementation of the 8x8 2-D Discrete Cosine Transform. The architecture uses a new algebraic integer encoding of the 1-D radix-8 DCT that allows the separable computation of a 2-D 8x8 DCT without any intermediate number representation conversions. This is a considerable improvement on previously introduced algebraic integer encoding techniques for computing both the DCT and IDCT: it eliminates the need to approximate the transformation matrix elements by obtaining their exact representations, and hence maps the transcendental functions without any errors. Besides being multiplication-free, this new mapping scheme fits the algorithm well, eliminating any computational or quantization errors and resulting in a short word length and a high-speed design.
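The gist of algebraic integer encoding, error-free bookkeeping of the cosine basis, can be sketched as follows (a simplified illustration of the general idea, not the paper's radix-8 architecture; all names are my own). Each cos(m·pi/16) is kept as a symbolic basis element, so an unnormalized 8-point DCT-II is computed with exact rational coefficients and converted to floating point only at the very end:

```python
import math
from collections import defaultdict
from fractions import Fraction

def reduce_angle(m):
    """Map cos(m*pi/16) onto a signed basis element (sign, index in 0..8)."""
    m %= 32
    if m > 16:               # cos is even and 2*pi-periodic
        m = 32 - m
    if m > 8:                # cos((16-k)*pi/16) = -cos(k*pi/16)
        return -1, 16 - m
    return 1, m

def dct8_exact(x):
    """8-point DCT-II with exact algebraic-integer bookkeeping.

    Returns, for each k, a dict {basis index m: rational coefficient}
    meaning sum_m coeff * cos(m*pi/16); the zero element c_8 is dropped.
    """
    out = []
    for k in range(8):
        acc = defaultdict(Fraction)
        for n, xn in enumerate(x):
            sign, m = reduce_angle((2 * n + 1) * k)
            if m != 8:                       # cos(pi/2) = 0
                acc[m] += sign * Fraction(xn)
        out.append({m: c for m, c in acc.items() if c})
    return out

def to_float(coeffs):
    """Evaluate an exact coefficient dict numerically, once, at the end."""
    return sum(float(c) * math.cos(m * math.pi / 16)
               for m, c in coeffs.items())
```

For a constant input, every AC coefficient cancels exactly to the empty combination, something a floating-point implementation can only approximate.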

Relevance:

100.00%

Publisher:

Abstract:

The purpose is to develop expert systems in which reasoning by analogy is used. Knowledge "closeness" problems are known to emerge frequently in such systems when knowledge is represented by different production rules. To determine a degree of closeness for production rules, a distance between predicates is introduced. Different types of distances between two predicate value distribution functions are considered for the case when the predicates are "true". Asymptotic features and interrelations of the distances are studied. Predicate value distribution functions are estimated by empirical distribution functions, and a procedure is proposed for this purpose. The adequacy of the obtained distribution functions is tested on the basis of the statistical χ²-criterion, and a testing mechanism is discussed. For parametric families of distribution functions, a theorem is proved by which the predicate closeness determination is replaced by a simple procedure of measuring Euclidean distances between distribution function parameters. The proposed distance measurement apparatus may be applied in expert systems where reasoning is created by analogy.
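The parametric shortcut in the theorem, replacing a distance between distribution functions by a Euclidean distance between their fitted parameters, can be sketched like this (a hedged toy for the normal family; the function names and sample data are my own):

```python
import math

def fit_normal(samples):
    """Empirical (mu, sigma) estimate of a predicate's value distribution."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((s - mu) ** 2 for s in samples) / n
    return mu, math.sqrt(var)

def rule_closeness(samples_a, samples_b):
    """Euclidean distance between the two fitted parameter vectors,
    used as a proxy for the distance between the distributions."""
    mu_a, sd_a = fit_normal(samples_a)
    mu_b, sd_b = fit_normal(samples_b)
    return math.hypot(mu_a - mu_b, sd_a - sd_b)

# Identical evidence -> zero distance; shifting one rule's predicate
# values by a constant moves only the mean component of the distance.
a = [9.8, 10.1, 10.0, 9.9, 10.2]
b = [s + 1.0 for s in a]
```

Identical evidence gives distance 0, and a pure location shift changes the score by exactly the size of the shift, since the fitted sigma is unaffected.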

Relevance:

100.00%

Publisher:

Abstract:

The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1 and 2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli: they produced recombinant somatostatin [3], followed shortly after by human insulin.

The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case.

Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts.

Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques.

E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field. The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines.

The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date.

It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications.

The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging 'in-cell NMR' techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes even in very large proteins, and even ribosomes, to be investigated. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins.

Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snap-shots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for 'integrative structural biology research'.

Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs).

Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Relevance:

100.00%

Publisher:

Abstract:

The authors analyse situations of subjective uncertainty and imprecision that arise when expert information is used, together with related questions connected with representing the uncertainty of expert information by fuzzy sets with rough membership functions. The article discusses the aggregation of individual expert marks and the connection between the overall marks' "degree of imprecision" and the sensitivity of the measurement scale. A number of situations connected with the distribution of membership function values and their bearing on concrete decision-making tasks are also analysed.

Relevance:

100.00%

Publisher:

Abstract:

In this paper we propose a model for encoding data into DNA strands so that the data can be used in the simulation of a genetic algorithm based on molecular operations. DNA computing is an impressive computational model that needs algorithms to work properly and efficiently. The first problem when trying to apply an algorithm in DNA computing is how to codify the data that the algorithm will use. In a genetic algorithm the first objective must be to codify the genes, which are the main data. A concrete encoding of the genes in a single DNA strand is presented, and we discuss what this codification is suitable for. Previous work on DNA coding defined bond-free languages, which have several properties assuring the stability of any DNA word of such a language. We prove that a bond-free language is necessary but not sufficient to codify a gene with a correct codification.
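A minimal sketch of the encoding concern (illustrative only: the codewords, the bit-to-codeword table and the simplified whole-codeword bond-freeness check are my own, not the paper's construction): genes are encoded by mapping each bit to a fixed DNA codeword, and the codeword set is checked so that no codeword is the reverse (Watson-Crick) complement of any codeword, since two such strands could hybridise.

```python
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def revcomp(strand):
    """Reverse Watson-Crick complement of a DNA strand."""
    return "".join(COMPLEMENT[b] for b in reversed(strand))

def is_bond_free(codewords):
    """True if no codeword can bind another (or itself) by hybridisation.
    This is a simplified, whole-word version of the bond-free property."""
    return all(revcomp(w) not in codewords for w in codewords)

def encode_gene(bits, zero="AAC", one="ACC"):
    """Encode a bit string as a single DNA strand using two codewords."""
    table = {"0": zero, "1": one}
    return "".join(table[b] for b in bits)

codes = {"AAC", "ACC"}       # revcomp("AAC") = "GTT", not in the set
```

The check above only rules out whole-codeword hybridisation; the bond-free languages cited in the paper constrain, roughly, all subwords of a given length, which is the stronger property the authors build on.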

Relevance:

100.00%

Publisher:

Abstract:

Measurement assisted assembly (MAA) has the potential to facilitate a step change in assembly efficiency for large structures such as airframes through the reduction of rework, manually intensive processes and expensive monolithic assembly tooling. It is shown how MAA can enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved aerodynamic tolerances. These advances will require the development of automated networks of measurement instruments, model-based thermal compensation, the automatic integration of 'live' measurement data into variation simulation, and algorithms to generate cutting paths for predictive shimming and drilling processes. This paper sets out an architecture for digital systems which will enable this integrated approach to variation management. © 2013 The Authors.
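As one concrete example of the model-based thermal compensation mentioned above, a linear-expansion correction referring measured lengths back to the standard 20 °C reference temperature is sketched below (an assumed first-order model, not the paper's implementation; the constant and function names are my own):

```python
ALPHA_AL = 23.1e-6   # linear expansion coefficient of aluminium, 1/K

def compensate_length(measured_mm, temp_c, alpha=ALPHA_AL, ref_c=20.0):
    """Convert a length measured at temp_c to its value at ref_c (20 degC)."""
    return measured_mm / (1.0 + alpha * (temp_c - ref_c))

# A 5 m aluminium rail measured at 30 degC reads about 1.16 mm long;
# the compensated value recovers the 20 degC length.
reading = 5000.0 * (1.0 + ALPHA_AL * 10.0)   # simulated warm measurement
```

Compensating each instrument reading this way before feeding it into variation simulation keeps nominal CAD comparisons valid regardless of shop-floor temperature.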

Relevance:

100.00%

Publisher:

Abstract:

Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduce additional difficulty for assembly accuracy and error estimation. Planar surfaces as key product characteristics are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-scale volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed by theoretical value and random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by measurement error of the planar surface in small components. To reduce the uncertainty of the plane measurement, an evaluation index of measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated by an inertia moment matrix. Finally, a practical application is introduced for validating the evaluation index.
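The evaluation index based on an inertia moment matrix can be illustrated with a simplified planar version (my own reduction, not the paper's exact index): the smallest eigenvalue of the second-moment matrix of the sampling points is large when the points are well spread in every direction and collapses to zero for a degenerate, nearly collinear sampling strategy.

```python
import math

def spread_index(points):
    """Smallest eigenvalue of the 2x2 second-moment matrix of the points,
    taken about their centroid; a closed form exists for the 2x2 case."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points) / n
    syy = sum((p[1] - cy) ** 2 for p in points) / n
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    return tr / 2 - math.sqrt(max(tr * tr / 4 - det, 0.0))

square = [(0, 0), (1, 0), (1, 1), (0, 1)]   # well-spread sampling
line = [(0, 0), (1, 0), (2, 0), (3, 0)]     # degenerate, collinear sampling
```

A larger index means the plane fit is constrained in all directions; a near-zero index flags a sampling strategy whose measurement error will dominate the assembly accuracy estimate.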

Relevance:

100.00%

Publisher:

Abstract:

Both theorists and practitioners have long been interested in legal and ethical issues regarding business activities. Socially accepted norms change from time to time; new dimensions are revealed and become coherent parts of the constantly forming views and frames of expected behaviour. The integration of new streams and approaches, however, is not an easy process from the perspective of either theory development or practical implementation, as it forces the transformation of rigid cognitive and behavioural structures. Today CSR has become more widely internalised in business terminology than ever, but its definition, its theoretical synthesis with related concepts, and the way it is applied in practice remain rather divergent. The presentation attempts to review the interpretations of CSR in the literature and to determine its possible points of connection with currently accepted marketing concepts and frameworks.