935 results for Analysis Tools


Relevance: 30.00%

Abstract:

This thesis provides a set of tools for managing uncertainty in Web-based models and workflows. To support the use of these tools, it first provides a framework for exposing models through Web services. An introduction to uncertainty management, Web service interfaces, and workflow standards and technologies is given, with a particular focus on the geospatial domain. An existing specification for exposing geospatial models and processes, the Web Processing Service (WPS), is critically reviewed. A processing service framework is presented as a solution to usability issues with the WPS standard. The framework implements support for Simple Object Access Protocol (SOAP), Web Service Description Language (WSDL) and JavaScript Object Notation (JSON), allowing models to be consumed by a variety of tools and software. Strategies for communicating with models from Web service interfaces are discussed, demonstrating the difficulty of exposing existing models on the Web. The thesis then reviews existing mechanisms for uncertainty management, with an emphasis on emulator methods for building efficient statistical surrogate models. A tool is developed to solve accessibility issues with such methods by providing a Web-based user interface and backend that ease the process of building and integrating emulators. These tools, together with the processing service framework, are applied to a real case study as part of the UncertWeb project. The usability of the framework is demonstrated through the implementation of a Web-based workflow for predicting future crop yields in the UK, which also demonstrates the tools' abilities for emulator building and integration. Future directions for the development of the tools are discussed.
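The request encoding that such a processing service accepts can be illustrated with the WPS standard's key-value-pair (KVP) form. The sketch below builds a WPS 1.0.0 Execute URL in Python; the endpoint, process identifier and inputs are hypothetical placeholders, not ones taken from the thesis.

```python
from urllib.parse import urlencode

def wps_execute_url(base_url, process_id, inputs):
    # KVP-encode a WPS 1.0.0 Execute request; DataInputs follows the
    # standard "key=value;key=value" form before URL-encoding.
    datainputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "datainputs": datainputs,
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint and process, for illustration only.
url = wps_execute_url("https://example.org/wps", "crop.yield.model",
                      {"year": 2050, "region": "UK"})
print(url)
```

A SOAP or JSON binding would carry the same identifier and inputs in a request body instead of the query string.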

Relevance: 30.00%

Abstract:

Microfluidics has recently emerged as a method of manufacturing liposomes that allows reproducible mixing in milliseconds at the nanoliter scale. Here we investigate microfluidics-based manufacturing of liposomes. The aim of these studies was to assess the parameters of a microfluidic process by varying the total flow rate (TFR) and the flow rate ratio (FRR) of the solvent and aqueous phases. Design of experiments and multivariate data analysis were used to increase process understanding and to develop predictive and correlative models. A high FRR led to the bottom-up synthesis of liposomes, with a strong correlation with vesicle size, demonstrating the ability to control liposome size in-process; the resulting liposome size correlated with the FRR in the microfluidics process, with liposomes of 50 nm being reproducibly manufactured. Furthermore, we demonstrate the potential for high-throughput manufacturing of liposomes using microfluidics, with a four-fold increase in the volumetric flow rate while maintaining liposome characteristics. The efficacy of these liposomes was demonstrated in transfection studies and modelled using predictive modelling. Mathematical modelling identified FRR as the key variable in the microfluidic process, with the highest impact on liposome size, polydispersity and transfection efficiency. This study demonstrates microfluidics as a robust, high-throughput method for the scalable and highly reproducible manufacture of size-controlled liposomes. Furthermore, the application of statistically based process control increases understanding and allows the generation of a design space for controlled particle characteristics.
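The two process parameters can be captured in a trivial helper. This is an illustrative sketch only (the function name and units are assumptions), not the paper's predictive model:

```python
def mixing_parameters(aqueous_flow, solvent_flow):
    # TFR is the sum of the two stream flow rates; FRR is the
    # aqueous:solvent ratio (units cancel, e.g. mL/min).
    tfr = aqueous_flow + solvent_flow
    frr = aqueous_flow / solvent_flow
    return tfr, frr

# A 3:1 FRR at 12 mL/min total flow corresponds to 9 mL/min aqueous
# against 3 mL/min solvent.
tfr, frr = mixing_parameters(aqueous_flow=9.0, solvent_flow=3.0)
print(tfr, frr)
```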

Relevance: 30.00%

Abstract:

Several levels of complexity are available for modelling wastewater treatment plants. Modelling local effects relies on computational fluid dynamics (CFD) approaches, whereas activated sludge models (ASM) represent the global methodology. By applying both modelling approaches to pilot-plant and full-scale systems, this paper evaluates the value of each method and especially their potential combination. Model structure identification for ASM is discussed based on the modelling of a full-scale closed-loop oxidation ditch. It is illustrated how, and under what circumstances, information obtained via CFD analysis, residence time distribution (RTD) and other experimental means can be used. Furthermore, CFD analysis of the multiphase flow mechanisms is employed to obtain a correct description of the oxygenation capacity of the system studied, including an easy implementation of this information in classical ASM modelling (e.g. oxygen transfer). The combination of CFD and activated sludge modelling of wastewater treatment processes is applied to three reactor configurations: a perfectly mixed reactor, a pilot-scale activated sludge basin (ASB) and a real-scale ASB. The application of the biological models to the CFD model is validated against experimental data for the pilot-scale ASB and against the response of a classical global ASM model. A first step in the evaluation of the potential of the combined CFD-ASM model is performed using a full-scale oxidation ditch system as testing scenario.
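As a minimal illustration of how RTD information feeds model structure identification, the sketch below implements the standard tanks-in-series exit-age distribution, a common way to interpret RTD data when choosing how many ideal reactors to use in an ASM layout; it is not the paper's CFD-ASM model:

```python
from math import exp, factorial

def rtd_tanks_in_series(t, tau, n):
    # Exit-age distribution E(t) for n equal stirred tanks in series
    # with overall mean residence time tau:
    #   E(t) = n^n * t^(n-1) / ((n-1)! * tau^n) * exp(-n*t/tau)
    return n**n * t**(n - 1) / (factorial(n - 1) * tau**n) * exp(-n * t / tau)

# Sanity check: the distribution integrates to ~1 over a long horizon.
dt = 0.01
area = sum(rtd_tanks_in_series(i * dt, 1.0, 3) for i in range(1, 4000)) * dt
print(area)
```

Fitting n to a measured tracer curve indicates how far the real basin is from plug flow (large n) or a single perfectly mixed tank (n = 1).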

Relevance: 30.00%

Abstract:

Developers of interactive software are confronted by an increasing variety of software tools to help engineer the interactive aspects of software applications. Not only do these tools fall into different categories in terms of functionality, but within each category there is a growing number of competing tools with similar, although not identical, features. Choice of user interface development tool (UIDT) is therefore becoming increasingly complex.

Relevance: 30.00%

Abstract:

JPEG2000 is an emerging image standard. In this paper we analyze the performance of the error resilience tools in JPEG2000 and present an analytical model to estimate the quality of JPEG2000-encoded images transmitted over wireless channels. The effectiveness of the analytical model is validated by simulation results. Furthermore, the analytical model is used by the base station to design efficient unequal error protection schemes for JPEG2000 transmission. In the design, a utility function is defined to trade off image quality against the cost of transmitting the image over the wireless channel. © 2002 IEEE.
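The quality/cost trade-off can be sketched with a generic weighted utility; the paper's actual utility function is not reproduced here, and the scheme names and numbers below are invented for illustration:

```python
def utility(psnr_db, cost, weight=5.0):
    # Generic weighted trade-off between decoded image quality (PSNR)
    # and the resource cost of the protection scheme.
    return psnr_db - weight * cost

# Hypothetical unequal-error-protection candidates:
# (expected PSNR in dB, relative channel cost).
schemes = {
    "no_protection": (28.0, 1.0),
    "medium_fec": (33.0, 1.6),
    "strong_fec": (34.0, 2.5),
}
best = max(schemes, key=lambda s: utility(*schemes[s]))
print(best)
```

The base station would pick the scheme maximizing this utility for the current channel conditions; with these illustrative numbers the strongest protection is not worth its cost.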

Relevance: 30.00%

Abstract:

Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli: they produced recombinant somatostatin [3], followed shortly after by human insulin.

The field has advanced enormously since those early days, and today recombinant proteins have become indispensable for advancing research and development across the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case.

Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host-cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts.

Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option for preparing samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques.

E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field. The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines.

The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating the crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date.

It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration when synthesizing proteins in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications.

The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments.

A current drawback of NMR as a structure determination tool derives from size limitations on the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction in the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins, and even in ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box and review its applications to the solution NMR analysis of large proteins.

Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’.

Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced and its high-resolution structure determined.

In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, coaxing challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization and highlight the application of this powerful technology to the crystallography of important protein specimens, including G protein-coupled receptors (GPCRs).

Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell, with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Relevance: 30.00%

Abstract:

Research in social psychology has shown that public attitudes towards feminism are mostly based on stereotypical views linking feminism with leftist politics and lesbian orientation. It is claimed that such attitudes are due to the negative and sexualised media construction of feminism. Studies concerned with the media representation of feminism seem to confirm this tendency. While most of this research provides significant insights into the representation of feminism, the findings are often based on a small sample of texts. Also, most of the research was conducted in an Anglo-American setting. This study attempts to address some of the shortcomings of previous work by examining the discourse of feminism in a large corpus of German and British newspaper data. It does so by employing the tools of Corpus Linguistics. By investigating the collocation profiles of the search term feminism, we provide evidence of salient discourse patterns surrounding feminism in two different cultural contexts. © The Author(s) 2012.
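A collocation profile of the kind used here can be sketched as a simple windowed co-occurrence count. Real corpus work would add association measures such as mutual information or log-likelihood, which this toy example (with an invented mini-corpus) omits:

```python
from collections import Counter

def collocates(tokens, node, window=3):
    # Count tokens appearing within +/-window positions of each
    # occurrence of the node word.
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if j != i:
                    counts[tokens[j]] += 1
    return counts

text = "radical feminism and liberal feminism shaped the debate".split()
profile = collocates(text, "feminism")
print(profile.most_common(3))
```

Over a large newspaper corpus, the most frequent (and most strongly associated) collocates of the node word reveal the recurring discourse patterns the study describes.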

Relevance: 30.00%

Abstract:

Surface quality is important in engineering, and a vital aspect of it is surface roughness, since it plays an important role in the wear resistance, ductility, and tensile and fatigue strength of machined parts. This paper reports on a research study on the development of a geometrical model for surface roughness prediction when face milling with square inserts. The model is based on a geometrical analysis of the recreation of the tool trail left on the machined surface. The model has been validated with experimental data obtained for high-speed milling of aluminum alloy (Al 7075-T7351) using a wide range of cutting speeds, feeds per tooth, axial depths of cut and different values of tool nose radius (0.8 mm and 2.5 mm), with the Taguchi method as the design of experiments. The experimental roughness was obtained by measuring the milled surfaces with a non-contact profilometer. The developed model can be used for any combination of workpiece material and tool when tool flank wear is not considered, and is suitable for any tool diameter with any number of teeth and any tool nose radius. The results show that the developed model achieved excellent performance, with almost 98% accuracy in predicting the surface roughness compared to the experimental data. © 2014 The Society of Manufacturing Engineers.
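For intuition about how tool geometry drives roughness, the sketch below uses the classic textbook approximation Rt ≈ f²/(8r) for the peak-to-valley height left by a round tool nose at feed f. This is a first-order geometric estimate, not the paper's full recreation of the tool trail:

```python
def ideal_peak_roughness_um(feed_mm, nose_radius_mm):
    # Classic geometric approximation for the peak-to-valley roughness
    # left by a round tool nose: Rt ~= f^2 / (8 r), here converted from
    # mm to micrometres.
    return feed_mm**2 / (8.0 * nose_radius_mm) * 1000.0

# The two nose radii from the study: the larger radius leaves a
# markedly smoother surface at the same feed.
print(ideal_peak_roughness_um(0.2, 0.8))
print(ideal_peak_roughness_um(0.2, 2.5))
```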

Relevance: 30.00%

Abstract:

The uncertainty of measurements must be quantified and considered in order to prove conformance with specifications and to make other meaningful comparisons based on measurements. While there is a consistent methodology for the evaluation and expression of uncertainty within the metrology community, industry frequently uses the alternative Measurement Systems Analysis (MSA) methodology. This paper sets out to clarify the differences between uncertainty evaluation and MSA, and presents a novel hybrid methodology for industrial measurement which enables a correct evaluation of measurement uncertainty while utilising the practical tools of MSA. In particular, the use of Gage R&R ANOVA and Attribute Gage studies within a wider uncertainty evaluation framework is described. This enables in-line measurement data to be used to establish repeatability and reproducibility without time-consuming repeatability studies, while maintaining a complete consideration of all sources of uncertainty and therefore enabling conformance to be proven with a stated level of confidence. Such a rigorous approach to product verification will become increasingly important in the era of the Light Controlled Factory, with metrology acting as the driving force to achieve right-first-time and highly automated manufacture of high-value, large-scale products such as aircraft, spacecraft and renewable power generation structures.
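The uncertainty-evaluation side of such a hybrid methodology rests on the standard GUM combination of uncorrelated uncertainty components, which can be sketched as follows (the component values below are illustrative only, standing in for repeatability and reproducibility from a Gage R&R study plus a calibration contribution):

```python
from math import sqrt

def combined_standard_uncertainty(components):
    # GUM root-sum-of-squares combination of uncorrelated standard
    # uncertainty contributions.
    return sqrt(sum(u * u for u in components))

def expanded_uncertainty(components, k=2.0):
    # Coverage factor k = 2 gives roughly 95% coverage for a normal
    # distribution.
    return k * combined_standard_uncertainty(components)

# Illustrative contributions in mm: repeatability, reproducibility,
# calibration.
U = expanded_uncertainty([0.010, 0.006, 0.008], k=2.0)
print(U)
```

Conformance is then proven by comparing the measured value plus or minus U against the specification limits.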

Relevance: 30.00%

Abstract:

With the development of social media tools such as Facebook and Twitter, mainstream media organizations, including newspapers and TV media, have played an active role in engaging with their audiences and strengthening their influence on these recently emerged platforms. In this paper, we analyze the behavior of mainstream media on Twitter and study how they exert their influence to shape public opinion during the UK's 2010 General Election. We first propose an empirical measure to quantify mainstream media bias based on sentiment analysis, and show that it correlates better with the actual political bias in the UK media than purely quantitative measures based on media coverage of the various political parties. We then compare the information diffusion patterns from different categories of sources. We find that while mainstream media are good at seeding prominent information cascades, their role in shaping public opinion is being challenged by journalists, since tweets from journalists are more likely to be retweeted, spread faster, and have a longer lifespan than tweets from mainstream media. Moreover, the political bias of the journalists is a good indicator of the actual election results. Copyright 2013 ACM.
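A minimal sketch of a sentiment-based bias measure, assuming tweets have already been scored for sentiment toward a party; the paper's actual measure is richer, and the party names and scores below are invented:

```python
from collections import defaultdict

def mean_sentiment_by_party(tweets):
    # tweets: iterable of (party, sentiment_score) pairs; returns the
    # average sentiment expressed toward each party.
    totals = defaultdict(float)
    counts = defaultdict(int)
    for party, score in tweets:
        totals[party] += score
        counts[party] += 1
    return {party: totals[party] / counts[party] for party in totals}

sample = [("labour", 0.4), ("labour", -0.2), ("conservative", 0.3)]
print(mean_sentiment_by_party(sample))
```

A source's bias toward a party can then be read off as the gap between its per-party averages.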

Relevance: 30.00%

Abstract:

A two-degrees-of-freedom (2-DOF) actuator capable of producing linear translation, rotary motion, or helical motion would be a desirable asset in the fields of machine tools, robotics, and various apparatuses. In this paper, a novel 2-DOF split-stator induction motor is proposed, and the electromagnetic structure parameters of the motor are designed and optimized. The feature of the direct-drive 2-DOF induction motor lies in its solid mover arrangement. In order to study the complex distribution of the eddy-current field on the ferromagnetic cylinder mover and the motor's operating characteristics, a mathematical model of the proposed motor was established, and the characteristics of the motor were analyzed using the permeation depth method (PDM) and the finite element method (FEM). The analytical and numerical results from the motor simulation clearly show a correlation between the PDM and FEM models. This may be considered a fair justification for the proposed machine and design tools.
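The penetration (skin) depth that a PDM-style analysis builds on follows the standard formula δ = sqrt(ρ / (π f μ0 μr)) for a conductor of resistivity ρ and relative permeability μr at frequency f. The sketch below evaluates it for copper as a sanity check; the material values are textbook figures, not parameters of the motor in the paper:

```python
from math import pi, sqrt

MU_0 = 4e-7 * pi  # permeability of free space, H/m

def penetration_depth(freq_hz, resistivity_ohm_m, mu_r):
    # Standard electromagnetic skin-depth formula:
    #   delta = sqrt(rho / (pi * f * mu0 * mu_r))
    return sqrt(resistivity_ohm_m / (pi * freq_hz * MU_0 * mu_r))

# Copper at 50 Hz (rho ~ 1.68e-8 ohm.m, mu_r ~ 1): about 9.2 mm.
print(penetration_depth(50.0, 1.68e-8, 1.0))
```

For a ferromagnetic mover, the much larger μr drives the eddy currents into a thin surface layer, which is why a dedicated penetration-depth treatment is needed.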

Relevance: 30.00%

Abstract:

The extant literature on workplace coaching is characterised by a lack of theoretical and empirical understanding regarding the effectiveness of coaching as a learning and development tool; the types of outcomes one can expect from coaching; the tools that can be used to measure coaching outcomes; the underlying processes that explain why and how coaching works; and the factors that may impact on coaching effectiveness. This thesis sought to address these substantial gaps in the literature with three linked studies. First, a meta-analysis of workplace coaching effectiveness (k = 17) synthesizing the existing research was presented. A framework of coaching outcomes was developed and used to code the studies. Analysis indicated that coaching had positive effects on all outcomes. Next, the framework of outcomes was used as the deductive starting point for the development of a scale measuring perceived coaching effectiveness. Using a multi-stage approach (n = 201), the analysis indicated that perceived coaching effectiveness may be organised into a six-factor structure: career clarity; team performance; work well-being; performance; planning and organizing; and personal effectiveness and adaptability. The final study was a longitudinal field experiment testing a theoretical model of individual differences and coaching effectiveness developed in this thesis. An organizational sample of 84 employees each participated in a coaching intervention, completed self-report surveys, and had their job performance rated by peers, direct reports and supervisors (a total of 352 employees provided data on participant performance). The results demonstrate that, compared to a control group, the coaching intervention generated a number of positive outcomes. The analysis indicated that coachees' enthusiasm, intellect and orderliness influenced the impact of coaching on outcomes. Mediation analysis suggested that mastery goal orientation, performance goal orientation and approach motivation in the form of behavioural activation system (BAS) drive were significant mediators between personality and outcomes. Overall, the findings of this thesis make an original contribution to the understanding of the types of outcomes that can be expected from coaching and the magnitude of coaching's impact on those outcomes. The thesis also provides a tool for reliably measuring coaching effectiveness and a theoretical model for understanding the influence of coachee individual differences on coaching outcomes.

Relevance: 30.00%

Abstract:

In this paper we apply cooperative game theory concepts to analyze supply chains. The bullwhip effect in a two-stage supply chain (supplier-manufacturer) is considered in the framework of the Arrow-Karlin model with linear inventory holding and convex production costs. It is assumed that both firms minimize their relevant costs, and two operating regimes are compared: a hierarchical (decentralized) decision-making system, in which first the manufacturer and then the supplier optimizes its own position, and a centralized (cooperative) model, in which the firms minimize their joint costs. The question of how the participants should share the savings from the reduced bullwhip effect in the centralized (cooperative) model is answered using the tools of transferable utility cooperative game theory.
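For a two-player transferable-utility cost game, one standard sharing rule is the Shapley value, under which each firm pays its stand-alone cost minus half the cooperative saving. The sketch below illustrates this with invented cost figures; the paper's game may of course be richer:

```python
def shapley_split(cost_supplier, cost_manufacturer, cost_joint):
    # Two-player TU cost game: the Shapley value charges each firm its
    # stand-alone cost minus half the saving from cooperating.
    saving = cost_supplier + cost_manufacturer - cost_joint
    return cost_supplier - saving / 2.0, cost_manufacturer - saving / 2.0

# Illustrative stand-alone costs of 60 and 50 against a joint cost of 90:
# the saving of 20 is split equally, giving allocations of 50 and 40.
print(shapley_split(60.0, 50.0, 90.0))
```

The allocations sum to the joint cost, so the split is feasible, and each firm pays less than it would alone, so cooperation is individually rational.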

Relevance: 30.00%

Abstract:

The study sought connections between the diverse areas of marketing and corporate competitiveness, and compared the results with those of a similar survey conducted five years earlier. The analysis covers how managers perceive the role of marketing in corporate performance, and how product and brand decisions, the management of services, and advertising activity affect performance. The research also examines the organisational representation of marketing and its relationship with other corporate functions, and then applies the resource-based view to analyse the effect of marketing assets and capabilities on competitiveness. Based on the results, we conclude that marketing practice is linked to corporate performance at several points, but that the marketing-related capabilities needed to operate, monitor and renew a company's marketing system are increasingly coming to the fore.