938 results for Research Tools
Abstract:
Project Focus: The main INFRAWEBS project focus and objective is the development of an application-oriented software toolset for creating, maintaining and executing WSMO-based Semantic Web Services (SWS) within their whole life cycle. This next generation of tools and systems will enable software and service providers to build open and extensible development platforms for web service applications. These services will run on open standards and specifications, such as BPEL4WS, WSMO, WSMX, WSML, SPARQL, RDF, etc. In particular, they will be compliant with WSMO (Web Services Modelling Ontology), a W3C initiative in Semantic Web services.
Abstract:
Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins.
A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field. The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P.
pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell-lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself.
Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes even in very large proteins and even ribosomes to be investigated. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins. Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snap-shots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. 
They represent long polypeptide chains in which individual smaller proteins with different biological function are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: The requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. 
We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.
Abstract:
Micro Electro Mechanical Systems (MEMS) have already revolutionized several industries through miniaturization and cost effective manufacturing capabilities that were never possible before. However, commercially available MEMS products have only scratched the surface of the application areas where MEMS has potential. The complex and highly technical nature of MEMS research and development (R&D) combined with the lack of standards in areas such as design, fabrication and test methodologies, makes creating and supporting a MEMS R&D program a financial and technological challenge. A proper information technology (IT) infrastructure is the backbone of such research and is critical to its success. While the lack of standards and the general complexity in MEMS R&D makes it impossible to provide a “one size fits all” design, a systematic approach, combined with a good understanding of the MEMS R&D environment and the relevant computer-aided design tools, provides a way for the IT architect to develop an appropriate infrastructure.
Abstract:
The evaluation of geospatial data quality and trustworthiness presents a major challenge to geospatial data users when making a dataset selection decision. The research presented here therefore focused on defining and developing a GEO label – a decision support mechanism to assist data users in efficient and effective geospatial dataset selection on the basis of quality, trustworthiness and fitness for use. This thesis thus presents six phases of research and development conducted to: (1) identify the informational aspects upon which users rely when assessing geospatial dataset quality and trustworthiness; (2) elicit initial user views on the GEO label role in supporting dataset comparison and selection; (3) evaluate prototype label visualisations; (4) develop a Web service to support GEO label generation; (5) develop a prototype GEO label-based dataset discovery and intercomparison decision support tool; and (6) evaluate the prototype tool in a controlled human-subject study. The results of the studies revealed, and subsequently confirmed, eight geospatial data informational aspects that were considered important by users when evaluating geospatial dataset quality and trustworthiness, namely: producer information, producer comments, lineage information, compliance with standards, quantitative quality information, user feedback, expert reviews, and citations information. Following an iterative user-centred design (UCD) approach, it was established that the GEO label should visually summarise availability and allow interrogation of these key informational aspects. A Web service was developed to support generation of dynamic GEO label representations and integrated into a number of real-world GIS applications. The service was also utilised in the development of the GEO LINC tool – a GEO label-based dataset discovery and intercomparison decision support tool.
The results of the final evaluation study indicated that (a) the GEO label effectively communicates the availability of dataset quality and trustworthiness information and (b) GEO LINC successfully facilitates ‘at a glance’ dataset intercomparison and fitness for purpose-based dataset selection.
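The label's "availability summary" role can be illustrated with a small sketch. The eight aspect names below come from the abstract; the data structure and function names are hypothetical, not the actual GEO label Web service API.

```python
# Hypothetical sketch of a GEO-label-style availability summary.
# The eight aspect names come from the study; everything else is illustrative.

GEO_LABEL_ASPECTS = [
    "producer information",
    "producer comments",
    "lineage information",
    "compliance with standards",
    "quantitative quality information",
    "user feedback",
    "expert reviews",
    "citations information",
]

def availability_summary(metadata: dict) -> dict:
    """Report, per aspect, whether the dataset's metadata provides it."""
    return {aspect: bool(metadata.get(aspect)) for aspect in GEO_LABEL_ASPECTS}

# A made-up dataset record with only three aspects filled in:
dataset = {
    "producer information": "National Mapping Agency",
    "lineage information": "Derived from a 2019 aerial survey",
    "user feedback": ["useful for flood mapping"],
}
summary = availability_summary(dataset)
print(sum(summary.values()), "of", len(summary), "aspects available")
```

A real label would additionally support drill-down ("interrogation") into each aspect, per the UCD findings; this sketch covers only the availability side.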
Abstract:
Surface quality is important in engineering and a vital aspect of it is surface roughness, since it plays an important role in wear resistance, ductility, tensile, and fatigue strength for machined parts. This paper reports on a research study on the development of a geometrical model for surface roughness prediction when face milling with square inserts. The model is based on a geometrical analysis of the recreation of the tool trail left on the machined surface. The model has been validated with experimental data obtained for high-speed milling of aluminum alloy (Al 7075-T7351) when using a wide range of cutting speed, feed per tooth, axial depth of cut and different values of tool nose radius (0.8 mm and 2.5 mm), using the Taguchi method as the design of experiments. The experimental roughness was obtained by measuring the surface roughness of the milled surfaces with a non-contact profilometer. The developed model can be used for any combination of workpiece material and tool, when tool flank wear is not considered, and is suitable for any tool diameter with any number of teeth and any tool nose radius. The results show that the developed model achieved excellent performance, with almost 98% accuracy in predicting the surface roughness when compared to the experimental data. © 2014 The Society of Manufacturing Engineers.
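As a point of reference for the geometry involved, the classic textbook approximation for the ideal (kinematic) roughness left by a round-nosed cutting edge relates feed per tooth f and nose radius r. This is a generic first-order estimate, not the paper's full geometrical model:

```python
# Ideal (kinematic) surface roughness for a round-nosed cutting edge:
#   peak-to-valley  Rt ≈ f² / (8·r)
#   arithmetic mean Ra ≈ f² / (32·r)
# valid when the feed per tooth f is small relative to the nose radius r.
# Standard textbook approximation, not the geometrical model of the paper.

def ideal_roughness_um(feed_mm: float, nose_radius_mm: float) -> tuple:
    """Return (Rt, Ra) in micrometres; inputs in millimetres."""
    rt_mm = feed_mm ** 2 / (8 * nose_radius_mm)
    ra_mm = feed_mm ** 2 / (32 * nose_radius_mm)
    return rt_mm * 1000, ra_mm * 1000

# Example with the two nose radii used in the paper's experiments:
for r in (0.8, 2.5):
    rt, ra = ideal_roughness_um(feed_mm=0.1, nose_radius_mm=r)
    print(f"r = {r} mm: Rt ≈ {rt:.2f} µm, Ra ≈ {ra:.2f} µm")
```

Note how the larger 2.5 mm nose radius roughly triples the theoretical finish quality at the same feed, which is the kind of geometric effect the paper's model captures in full.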
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May, 2014
Abstract:
Higher and further education institutions are increasingly using social software tools to support teaching and learning. A growing body of research investigates the diversity of tools and their range of contributions. However, little research has focused on investigating the role of the educator in the context of a social software initiative, even though the educator is critical for the introduction and successful use of social software in a course environment. Hence, we argue that research on social software should place greater emphasis on the educators, as their roles and activities (such as selecting the tools, developing the tasks and facilitating the student interactions on these tools) are instrumental to most aspects of a social software initiative. To this end, we have developed an agenda for future research on the role of the educator. Drawing on role theory, both as the basis for a systematic conceptualization of the educator role and as a guiding framework, we have developed a series of concrete research questions that address core issues associated with the educator roles in a social software context and provide recommendations for further investigations. By developing a research agenda we hope to stimulate research that creates a better understanding of the educator’s situation and develops guidelines to help educators carry out their social software initiatives. Considering the significant role an educator plays in the initiation and conduct of a social software initiative, our research agenda ultimately seeks to contribute to the adoption and efficient use of social software in the educational domain.
Abstract:
The aim of this paper is to present, with the help of the results of an empirical study, the practice of companies operating in Hungary in managing the distribution side of the supply chain. The paper consists of two parts. The first part gives a literature review of the management tools that companies may use when managing their distribution processes in the supply chain. The second part introduces the results of the empirical research. 92 companies participated in the survey (of which 79 could be included in the analysis), and their responses and the statistical analysis paint a picture of how intensively they apply distribution chain management tools, and of the maturity levels that can be distinguished on the basis of the extent of that application.
Abstract:
The study sought the associations between the diverse areas of marketing and corporate competitiveness, and compared the results with those of a similar survey conducted five years earlier. The analysis covers how managers perceive the role of marketing in corporate performance, and how product and branding decisions, the management of services, and advertising activity affect that performance. The research also examines the organisational representation of marketing and its relationship with other corporate functions, and then, using the resource-based view, analyses the effect of marketing assets and capabilities on competitiveness. Based on the results we can conclude that marketing practice is linked to corporate performance at several points, but that the marketing-related capabilities needed to operate, monitor and renew the company's marketing system are coming to the fore.
Abstract:
Knowledge of the expected effects of climate change on aquatic ecosystems comes from three sources. First, long-term observation in the field serves as a basis for identifying possible changes; second, the experimental approach can bring valuable information to the research field. The future effects of climate change, however, cannot be studied empirically; rather, mathematical models are the useful tools for this purpose. In this study, the main findings of field observations and their implications for the future are summarized, and the modelling approaches are discussed in more detail. Some models describe the variation of physical parameters in a given aquatic habitat, so our knowledge of the biota is confined to findings based on present observations. Others are designed to answer specific questions related to a given water body. Complex ecosystem models are the key to a better understanding of the possible effects of climate change. These models were generally not created for testing the influence of global warming, but rather focus on describing a complex system (e.g. a lake) involving environmental variables and nutrients. However, such models are also capable of studying climatic changes, by taking a large set of environmental variables into consideration. Mostly, the outputs are consistent with assumptions based on findings in the field. Since synthesized models are rather difficult to handle and require quite large data series, the authors propose a simpler modelling approach that is capable of examining the effects of global warming: weather-dependent simulation modelling of the seasonal dynamics of aquatic organisms within a simplified framework.
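A weather-dependent simulation of seasonal dynamics might look, in outline, like the following sketch: a population whose daily growth rate depends on temperature, driven by a sinusoidal seasonal temperature series. All parameter values and function names here are illustrative; none come from the study.

```python
import math

# Illustrative weather-dependent simulation of seasonal population dynamics:
# logistic growth whose rate peaks at an optimal temperature.
# All parameters are invented for illustration; none come from the study.

def daily_temperature(day: int, mean=12.0, amplitude=10.0) -> float:
    """Sinusoidal seasonal temperature (°C), coldest around day 0."""
    return mean - amplitude * math.cos(2 * math.pi * day / 365)

def growth_rate(temp_c: float, r_max=0.3, t_opt=20.0, width=8.0) -> float:
    """Gaussian temperature response of the intrinsic growth rate."""
    return r_max * math.exp(-((temp_c - t_opt) / width) ** 2)

def simulate(days=365, n0=1.0, capacity=1000.0) -> list:
    """One year of daily logistic growth under the seasonal temperature."""
    n = n0
    series = [n]
    for day in range(days):
        r = growth_rate(daily_temperature(day))
        n += r * n * (1 - n / capacity)
        series.append(n)
    return series

series = simulate()
print(f"peak abundance: {max(series):.1f} on day {series.index(max(series))}")
```

Even this toy version shows the point of the approach: rerunning it with a warmer temperature series shifts the timing and magnitude of the seasonal peak, without needing the large data series that full ecosystem models require.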
Abstract:
This study focuses on empirical investigations and seeks implications by utilizing three different methodologies to test various aspects of trader behavior. The first methodology utilizes Prospect Theory to determine trader behavior during periods of extreme wealth contraction. Secondly, a threshold model to examine the sentiment variable is formulated, and thirdly a study is made of the contagion effect and trader behavior. The connection between consumers' sense of financial well-being, or sentiment, and stock market performance has been studied at length. However, without data on actual versus experimental performance, implications based on this relationship are meaningless. The empirical agenda included examining a proprietary file of daily trader activities over a five-year period. Overall, during periods of extreme wealth-altering conditions, traders "satisfice" rather than choose the "best" alternative. A trader's degree of loss aversion depends on his or her prior investment performance. A model that explains the behavior of traders during periods of turmoil is developed; Prospect Theory and the data file influenced its design. Additional research included testing a model that permitted the data to signal the crisis through a threshold model. The third empirical study sought to investigate the existence of contagion caused by declining global wealth effects, using evidence from the mining industry in Canada. Contagion, where a financial crisis begins locally and subsequently spreads elsewhere, has been studied in terms of correlations among similar regions. The results provide support for Prospect Theory in two out of the three empirical studies. The dissertation emphasizes the need for specifying precise, testable models of investors' expectations by providing tools to identify paradoxical behavior patterns. True enhancements in this field must include empirical research utilizing reliable data sources to mitigate data-mining problems and allow researchers to distinguish between expectations-based and risk-based explanations of behavior. Through this type of research, it may be possible to systematically exploit "irrational" market behavior.
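The loss-aversion asymmetry that Prospect Theory predicts, and that the dissertation tests against trader data, is commonly captured by the Kahneman-Tversky value function. The sketch below uses their widely cited 1992 parameter estimates (α = β = 0.88, λ = 2.25), which are textbook values, not figures fitted in this dissertation:

```python
# Kahneman-Tversky prospect-theory value function:
#   v(x) = x**alpha             for gains  (x >= 0)
#   v(x) = -lam * (-x)**beta    for losses (x < 0)
# alpha = beta = 0.88 and lam = 2.25 are the classic 1992 estimates,
# not parameters estimated in the dissertation.

def value(x: float, alpha=0.88, beta=0.88, lam=2.25) -> float:
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# Losses loom larger than gains: |v(-100)| > |v(+100)|.
print(f"v(+100) = {value(100):.1f}, v(-100) = {value(-100):.1f}")
```

The concavity over gains and the steeper, convex branch over losses are what produce "satisficing" rather than expected-utility-maximizing choices in wealth-contracting periods.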
Abstract:
Grade three students used tablet computers with a pre-selected series of applications over a seven-month period at school and through a community afterschool program. The study determined that these students benefited from differentiated learning in the school environment and online collaborative play in the afterschool centre. Benefits of the exposure to digital tools included intergenerational learning, as children assisted both parents and teachers with digital applications; problem-solving; and enhanced collaborative play for students across environments. Although this study makes a contribution to the field of digital literacy and young learners, the researchers conclude that further investigation is warranted with regard to the inter-relationships between home, school and community as spaces for the learning and teaching of digital technologies.
Abstract:
Acknowledgements: The University of Aberdeen, UK, and the Bay of Bengal Large Marine Ecosystems (BOBLME) project are acknowledged for partial funding of this research.