967 results for IT tools


Relevance: 30.00%

Publisher:

Abstract:

Precision agriculture (PA) describes a suite of IT-based tools that allow farmers to electronically monitor soil and crop conditions and to analyze treatment options. This study tests a model explaining the difficulties of PA technology adoption. The model draws on theories of technology acceptance and diffusion of innovation and is validated using survey data from farms in Canada. Findings highlight the importance of compatibility among PA technology components and the crucial role of farmers' expertise. The model provides a theoretical and empirical basis for developing policies and initiatives to support PA technology adoption.
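Adoption models of this kind are often operationalised as a discrete-choice regression. The sketch below is purely illustrative: the logistic form and the coefficients are assumptions for demonstration, not the model estimated in the study.

```python
import math

def adoption_probability(compatibility, expertise, b0=-4.0, b1=0.04, b2=0.03):
    """Toy logistic adoption model: P(adopt) from 0-100 scores for
    component compatibility and farmer expertise.
    Coefficients are illustrative, not estimated from the study's data."""
    z = b0 + b1 * compatibility + b2 * expertise
    return 1.0 / (1.0 + math.exp(-z))

# Higher compatibility and expertise scores push adoption probability up.
p_high = adoption_probability(90, 80)
p_low = adoption_probability(30, 30)
```

In such a model, the marginal effect of each factor can be read off directly from the fitted coefficients, which is what makes it a common empirical companion to technology-acceptance theory.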


Purpose – To investigate the effectiveness of quality management training by reviewing commonly used critical success factors and tools rather than the overall methodological approach.

Design/methodology/approach – A web-based questionnaire consisting of 238 questions covering 77 tools and 30 critical success factors selected from leading academic and practitioner sources. The survey had 79 usable responses and the data were analysed using relevant statistical quality management tools. The results were validated in a series of structured workshops with quality management experts.

Findings – In general, most of the critical success factor statements for quality management are agreed with, although not all are implemented well. Many quality tools are not well known or understood, and training has an important role in raising awareness of them and ensuring they are used correctly.

Research limitations/implications – Generalisations are limited by the UK-centric nature of the sample.

Practical implications – Implications are discussed for organisations implementing quality management initiatives, for training organisations revising their quality management syllabi, and for academic institutions teaching quality management.

Originality/value – Most recent surveys have been aimed at the methodological level (i.e. “lean”, “Six Sigma”, “total quality management”, etc.); this research proposes that this has limited value because many of the tools and critical success factors are common to most of the methodologies. Quite uniquely, therefore, this research focuses on the tools and critical success factors themselves. Additionally, other recent comparable surveys have been less comprehensive and have not focused on training issues.


Successful commercialization of a technology such as Fiber Bragg Gratings requires the ability to manufacture devices repeatably, quickly and at low cost. Although photorefractive gratings were first reported in 1978, it was not until 1993, when phase mask fabrication was demonstrated, that this became feasible. More recently, draw-tower fabrication at production level and grating writing through the polymer jacket have been realized; both are important developments, since they preserve the intrinsic strength of the fiber. Potentially the most significant recent development has been femtosecond laser inscription of gratings. Although not yet a commercial technology, it provides the means of writing multiple gratings in the optical core, providing directional sensing capability in a single fiber. Femtosecond processing can also be used to machine the fiber, producing micron-scale slots and holes that enhance the interaction between the light in the core and the surrounding medium. © 2011 Bentham Science Publishers Ltd. All rights reserved.


Several levels of complexity are available for the modelling of wastewater treatment plants. Modelling of local effects relies on computational fluid dynamics (CFD) approaches, whereas activated sludge models (ASM) represent the global methodology. By applying both modelling approaches to pilot-plant and full-scale systems, this paper evaluates the value of each method and, especially, their potential combination. Model structure identification for ASM is discussed on the basis of the modelling of a full-scale closed-loop oxidation ditch. It is illustrated how, and in which circumstances, information obtained via CFD analysis, residence time distribution (RTD) and other experimental means can be used. Furthermore, CFD analysis of the multiphase flow mechanisms is employed to obtain a correct description of the oxygenation capacity of the system studied, including an easy implementation of this information (e.g. oxygen transfer) in classical ASM modelling. The combination of CFD and activated sludge modelling of wastewater treatment processes is applied to three reactor configurations: a perfectly mixed reactor, a pilot-scale activated sludge basin (ASB) and a full-scale ASB. The application of the biological models to the CFD model is validated against experiments for the pilot-scale ASB and against the response of a classical global ASM model. A first step in the evaluation of the potential of the combined CFD-ASM model is performed using a full-scale oxidation ditch system as the testing scenario.
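One classical way to connect a CFD flow description to a compartmental ASM layout is through the residence time distribution: the RTD extracted from CFD (or from a tracer experiment) is matched to an equivalent tanks-in-series model. Below is a minimal sketch using the standard textbook E(t) curve; the tank count and mean residence time are hypothetical values for illustration, not figures from the study.

```python
import math

def tanks_in_series_rtd(t, n, tau):
    """Residence time distribution E(t) for n ideal stirred tanks in series
    with total mean residence time tau (classical textbook model)."""
    tau_i = tau / n  # mean residence time of a single tank
    return t ** (n - 1) / (math.factorial(n - 1) * tau_i ** n) * math.exp(-t / tau_i)

# Hypothetical fit: n = 4 tanks, tau = 2.0 h, as might be matched to a tracer curve.
dt = 0.01
ts = [i * dt for i in range(1, 4000)]  # integrate out to 40 h (20 * tau)
e = [tanks_in_series_rtd(t, 4, 2.0) for t in ts]
area = sum(e) * dt                              # ~1: E(t) is a probability density
mean = sum(t * y for t, y in zip(ts, e)) * dt   # ~tau: the mean residence time
```

The fitted tank count n then defines how many perfectly mixed ASM compartments are chained together, which is the sense in which RTD information bridges the local (CFD) and global (ASM) descriptions.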


Prescribing support for paediatrics is diverse, encompassing both standard texts and electronic tools. Evidence concerning who should be supported, and by what method, is limited. This review aims to collate the current information available on prescribing support in paediatrics. Many tools designed to support prescribers are technology based, for example electronic prescribing and smartphone applications. There is a focus on prescriber education at both undergraduate and postgraduate level. In the UK, the majority of inpatient prescribing is done by junior medical staff, and it is important to ensure they are competent on qualification and supported in this role. A UK national prescribing assessment is being trialled to test for competence on graduation, and there are also tools available to test paediatric prescribing after qualification. No information is available on the tools and resources UK prescribers currently use to support their decision making. One US study reported a decrease in the availability of paediatric prescribing information in a popular reference text. There is limited evidence that decision-support tools improve patient outcomes; however, there is growing confirmation that electronic prescribing reduces medication errors. There have been reports of new error types, such as selection errors, occurring with the use of electronic prescribing. Another concern with computerised decision-support systems is deciding which alerts should be presented to the prescriber, and when and how often, in order to avoid alert fatigue. Little has been published concerning paediatric alerts, perhaps because commercial systems often do not include paediatric-specific support.


The proposed cybernetic concept is built on neurophysiological, neuropsychological and neurocybernetic data, supplemented by plausible hypotheses (including the author's own) that fill the gaps in these data. Attention is focused first on the general principles of memory organization in the brain and on the processes within it that realize psychical functions such as the perception and identification of input information about patterns, as well as the solving of problems specified by input and output conditions. The realization of the second, essentially cogitative function is discussed in terms of figurative and lingual thinking at the levels of intuition and understanding. The rationale for, and principles of, a bionic approach to the creation of corresponding artificial intelligence tools are proposed.


Autonomic systems are required to adapt continually to changing environments and user goals. This process involves the real-time update of the system's knowledge base, which should therefore be stored in a machine-readable format and automatically checked for consistency. OWL ontologies meet both requirements, as they represent collections of knowledge expressed in first-order logic and feature embedded reasoners. To take advantage of these OWL ontology characteristics, this PhD project will devise a framework comprising a theoretical foundation, tools and methods for developing knowledge-centric autonomic systems. Within this framework, the knowledge storage and maintenance roles will be fulfilled by a specialised class of OWL ontologies. ©2014 ACM.


The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1 and 2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3], followed shortly after by human insulin.

The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case.

Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts.

Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques.

E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field. The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines.

The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date.

It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications.

The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments.

A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins and even ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins.

Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snap-shots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’.

Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined.

In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, coaxing challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs).

Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.


The evaluation of geospatial data quality and trustworthiness presents a major challenge to geospatial data users when making a dataset selection decision. The research presented here therefore focused on defining and developing a GEO label – a decision support mechanism to assist data users in efficient and effective geospatial dataset selection on the basis of quality, trustworthiness and fitness for use. This thesis thus presents six phases of research and development conducted to: (1) identify the informational aspects upon which users rely when assessing geospatial dataset quality and trustworthiness; (2) elicit initial user views on the GEO label role in supporting dataset comparison and selection; (3) evaluate prototype label visualisations; (4) develop a Web service to support GEO label generation; (5) develop a prototype GEO label-based dataset discovery and intercomparison decision support tool; and (6) evaluate the prototype tool in a controlled human-subject study. The results of the studies revealed, and subsequently confirmed, eight geospatial data informational aspects that users considered important when evaluating geospatial dataset quality and trustworthiness, namely: producer information, producer comments, lineage information, compliance with standards, quantitative quality information, user feedback, expert reviews, and citation information. Following an iterative user-centred design (UCD) approach, it was established that the GEO label should visually summarise the availability of, and allow interrogation of, these key informational aspects. A Web service was developed to support generation of dynamic GEO label representations and was integrated into a number of real-world GIS applications. The service was also utilised in the development of the GEO LINC tool – a GEO label-based dataset discovery and intercomparison decision support tool.
The results of the final evaluation study indicated that (a) the GEO label effectively communicates the availability of dataset quality and trustworthiness information and (b) GEO LINC successfully facilitates ‘at a glance’ dataset intercomparison and fitness for purpose-based dataset selection.


Surface quality is important in engineering, and a vital aspect of it is surface roughness, since roughness plays an important role in the wear resistance, ductility, and tensile and fatigue strength of machined parts. This paper reports on the development of a geometrical model for surface roughness prediction when face milling with square inserts. The model is based on a geometrical analysis of the recreation of the tool trail left on the machined surface. The model has been validated with experimental data obtained for high-speed milling of aluminum alloy (Al 7075-T7351) over a wide range of cutting speed, feed per tooth and axial depth of cut, and for two values of tool nose radius (0.8 mm and 2.5 mm), using the Taguchi method as the design of experiments. The experimental roughness was obtained by measuring the milled surfaces with a non-contact profilometer. The developed model can be used for any combination of workpiece material and tool when tool flank wear is not considered, and it is suitable for any tool diameter, number of teeth and tool nose radius. The results show that the developed model achieved excellent performance, with almost 98% accuracy in predicting the surface roughness when compared to the experimental data. © 2014 The Society of Manufacturing Engineers.
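The best-known geometric baseline for such predictions is the classical peak-to-valley approximation Rt ≈ fz²/(8r) for a round-nosed tool. The sketch below uses that textbook relation for illustration only; it is not the model developed in the paper, which reconstructs the full tool trail, and the feed value is an assumed example.

```python
def theoretical_roughness_um(feed_per_tooth_mm, nose_radius_mm):
    """Classical geometric peak-to-valley roughness Rt ~ f_z^2 / (8 r),
    returned in micrometres. Illustrative textbook baseline only."""
    return feed_per_tooth_mm ** 2 / (8.0 * nose_radius_mm) * 1000.0

# With an assumed feed of 0.2 mm/tooth and the two nose radii used in the study:
rt_08 = theoretical_roughness_um(0.2, 0.8)  # 6.25 um
rt_25 = theoretical_roughness_um(0.2, 2.5)  # 2.0 um: larger nose radius, smoother surface
```

The relation captures the basic trade-off the abstract alludes to: roughness grows quadratically with feed per tooth and falls with increasing nose radius.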


Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2014.



The popular concept of co-opetition (competitive cooperation) has gained theoretical support in approaches to competitiveness, particularly in the fields of regional competitiveness and clusters. The term has also become distinctly fashionable in Hungarian tourism: destination management organizations (DMOs; in Hungarian, Turisztikai Desztináció Menedzsment Szervezetek, TDMSZ) serving co-opetition and touristic competitiveness have recently come to the centre of attention of tourism governance and the tourism profession. In this article the author's aim is to describe the theoretical connections between co-opetition and the competitiveness of tourism destinations. A further aim is to examine these theoretical foundations through primary research: mapping the patterns of cooperation among destination actors in three case studies (one Hungarian and two Austrian destinations), highlighting the specifics of the domestic and international cases as well as the critical differences arising from the destinations' stages of development.


Many problems in economics have been solved with the help of analogous models from physics. Economists are deeply divided over the extent to which economic models can be reduced to the results of physics or other natural sciences. Some argue that this is precisely why mainstream economic theory has turned into applied mathematics, able to examine economic questions only in isolation from their social-science aspects. Others, including the author of this study, hold that those economic problems where measurement is possible can be modelled well with the technical arsenal of the natural sciences, whereas problems where measurement is not possible, typically the social-science questions, will require far more complex techniques. The aim of this paper is to outline the newest results of physics, i.e. the stochastic mathematical relations of irreversible dynamics, relativity theory and quantum mechanics, from which economists may draw in formulating and solving particular problems. For example, a correct interpretation of time operators could bring a significant turn in macroeconomic theory, and the static equilibrium reference points used so far could be replaced by dynamic, time-varying stochastic equilibrium reference functions, placing many social-science and especially non-equilibrium economic questions in a revolutionary new light. Paul A. Samuelson (1947) already adapted the concepts and definitions of thermodynamics and biological evolution to economics, but he did not touch on the newest results of quantum mechanics, such as the time operators. This article summarizes the latest mathematical relations of physics, chemistry and biology that may prove useful for more complex formulations of economic models. In addition, following Samuelson, it is shown that the von Neumann growth model cannot be explained as a peculiar extension of thermodynamic irreversibility.


The study sought the connections between the wide-ranging functions of marketing and corporate competitiveness, and compared the results with those of a similar survey conducted five years earlier. The analysis covers how managers perceive the role of marketing in corporate performance, and how product and brand decisions, the management of services and advertising activity influence that performance. The research also addresses the organisational representation of marketing and its relationship with other corporate functions, and then, applying the resource-based view, analyses the effect of marketing assets and capabilities on competitiveness. Based on the results we can conclude that marketing practice is linked to corporate performance at several points, but the marketing-related capabilities needed to operate, monitor and renew the company's marketing system are coming to the fore.