926 results for Production technology


Relevance:

30.00%

Publisher:

Abstract:

The telescopic conversion of glucose to fructose and then to 5-hydroxymethylfurfural (5-HMF), the latter a potential bio-derived platform chemical feedstock, has been explored over a family of bifunctional sulfated zirconia catalysts possessing tuneable acid-base properties. Characterisation by acid-base titration, XPS, XRD and Raman spectroscopy reveals that submonolayer SO4 coverages offer the ideal balance of basic and Lewis-Brønsted acid sites required to respectively isomerise glucose to fructose and subsequently dehydrate fructose to 5-HMF. A constant acid-site-normalised turnover frequency is observed for fructose dehydration to 5-HMF, confirming that a common Brønsted acid site is responsible for this transformation. This journal is © The Royal Society of Chemistry.
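
For orientation, the acid-site-normalised turnover frequency referred to here takes the standard form (a textbook definition, not quoted from the paper):

$$\mathrm{TOF} = \frac{n_{\text{fructose converted}}}{n_{\text{acid sites}} \cdot t}$$

A TOF that stays constant as sulfate coverage (and hence acid site density) varies is evidence that a single type of Brønsted site carries the dehydration step.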

Relevance:

30.00%

Publisher:

Abstract:

Photodeposition of H2PtCl6 in the presence of methanol promotes the formation of highly dispersed, metallic Pt nanoparticles over titania, likely via capture of photogenerated holes by the alcohol to produce an excess of surface electrons for substrate-mediated transfer to Pt complexes, resulting in a high density of surface nucleation sites for Pt reduction. Photocatalytic hydrogen production from water is proportional to the surface density of Pt metal co-catalyst, and hence photodeposition in the presence of high methanol concentrations affords a facile route to optimising photocatalyst design and highlights the importance of tuning co-catalyst properties in photocatalysis.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an assessment of the technical and economic performance of thermal processes that generate electricity from a wood chip feedstock by combustion, gasification and fast pyrolysis. The scope of the work begins with the delivery of wood chips to a conversion plant and ends with the supply of electricity to the grid, incorporating wood chip preparation, thermal conversion, and electricity generation in dual-fuel diesel engines. Net generating capacities of 1-20 MWe are evaluated. The techno-economic assessment rests on a suite of models that are combined to give cost and performance data for the integrated system. The models cover feed pretreatment, combustion, atmospheric and pressure gasification, fast pyrolysis with pyrolysis liquid storage and transport (an optional step in de-coupled systems), and diesel engine or turbine power generation, and they calculate system efficiencies, capital costs and production costs. An identical methodology is applied in the development of all the models so that the results are directly comparable.

The electricity production costs have been calculated for 10th-plant systems, indicating the costs achievable in the medium term, after the high initial costs associated with novel technologies have fallen. At the larger scales the costs converge with the mean electricity price paid in the EU by a large consumer, so there is potential for fast pyrolysis and diesel engine systems to sell electricity directly to large consumers or to generate on-site. However, competition will be fierce at all capacities, since electricity production costs vary only slightly between the four biomass-to-electricity systems evaluated.

System de-coupling is one way the fast pyrolysis and diesel engine system can distinguish itself from the other conversion technologies. The evaluations in this work show that situations requiring several remote generators are much better served by a large fast pyrolysis plant that supplies fuel to de-coupled diesel engines than by constructing an entire close-coupled system at each generating site. Another advantage of de-coupling is that the fast pyrolysis conversion step and the diesel engine generation step can operate independently, with intermediate storage of the fast pyrolysis liquid fuel, increasing overall reliability. Peak-load or seasonal power requirements would also benefit from de-coupling, since a small fast pyrolysis plant could operate continuously to produce fuel that is stored for use in the engine on demand.

Current electricity production costs for a fast pyrolysis and diesel engine system are 0.091/kWh at 1 MWe when learning effects are included. These systems are handicapped by the typical characteristics of a novel technology: high capital cost, high labour requirements and low reliability. The more established combustion and steam cycle therefore produces lower-cost electricity under current conditions. The fast pyrolysis and diesel engine system is a low capital cost option, but it also suffers from relatively low system efficiency, particularly at high capacities. This low efficiency results from the low conversion efficiency of feed energy into pyrolysis liquid, because of the energy lost to the char by-product. A sensitivity analysis has highlighted the strong impact of the fast pyrolysis liquids yield on electricity production costs. The liquids yield should be set realistically during design, and it should be maintained in practice by careful attention to plant operation and feed quality. Another problem is the high power consumption of feedstock grinding. Efficiencies may be enhanced in ablative fast pyrolysis, which can tolerate a chipped feedstock, but this has yet to be demonstrated at commercial scale.

In summary, the fast pyrolysis and diesel engine system has great potential to generate electricity at a profit in the long term, and at a lower cost than any other biomass-to-electricity system at small scale. This future viability can only be achieved through the construction of early plants that may, in the short term, be more expensive than the combustion alternative. Profitability in the short term can best be achieved by exploiting niches in the market and specific features of fast pyrolysis. These include:
• countries or regions with fiscal incentives for renewable energy, such as premium electricity prices or capital grants;
• locations with high electricity prices, where electricity can be sold directly to large consumers or generated on-site by companies that wish to reduce their consumption from the grid;
• waste disposal opportunities where feedstocks can attract a gate fee rather than incur a cost;
• the ability to store fast pyrolysis liquids as a buffer against shutdowns or as a fuel for peak-load generating plant;
• de-coupling opportunities where a large, single pyrolysis plant supplies fuel to several small and remote generators;
• small-scale combined heat and power opportunities;
• sales of the excess char, although a market has yet to be established for this by-product; and
• potential co-production of speciality chemicals and fuel for power generation in fast pyrolysis systems.
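
The shape of the cost calculations described above can be illustrated with a toy levelized-cost sketch: annualised capital plus operating and feed costs, divided by net electricity output. All parameter values below are hypothetical assumptions, not figures from the study, and the paper's models are far more detailed.

```python
# Simplified production-cost model for a biomass-to-electricity plant.
# All parameter values are illustrative assumptions, not the study's figures.

def electricity_cost(capacity_mwe, specific_capital, feed_price_per_gj,
                     efficiency, o_and_m_frac=0.04, capacity_factor=0.85,
                     discount_rate=0.10, lifetime_years=20):
    """Return electricity production cost per kWh (same currency as the inputs)."""
    capital = specific_capital * capacity_mwe * 1_000        # cost per kWe -> total
    # Capital recovery factor annualises the up-front investment.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years /
           ((1 + discount_rate) ** lifetime_years - 1))
    output_kwh = capacity_mwe * 1_000 * 8_760 * capacity_factor
    feed_gj = output_kwh * 0.0036 / efficiency               # 1 kWh = 0.0036 GJ
    annual_cost = capital * (crf + o_and_m_frac) + feed_gj * feed_price_per_gj
    return annual_cost / output_kwh

# Hypothetical 1 MWe fast pyrolysis + diesel engine plant:
# 2500/kWe capital, feed at 2.0/GJ, 25% net electrical efficiency.
print(f"{electricity_cost(1.0, 2500, 2.0, 0.25):.3f} per kWh")
```

With these assumed inputs the sketch lands in the same order of magnitude as the cost quoted above, but the point is only the structure of the calculation, which is common to all four systems compared.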

Relevance:

30.00%

Publisher:

Abstract:

Emulsions and microcapsules are typical structures in dispersion formulations for pharmaceutical, food, personal care and household care applications. Precise control over the size and size distribution of emulsion droplets and microcapsules is important for the effective use and delivery of active components and for better product quality. Many emulsification technologies have been developed to meet different formulation and processing requirements. Among them, membrane and microfluidic emulsification are emerging technologies able to manufacture droplets precisely, in a drop-by-drop manner, to prescribed sizes and size distributions with lower energy consumption. This paper reviews the fundamental science and engineering aspects of emulsification, membrane and microfluidic emulsification technologies, and their use for the precision manufacture of emulsions for intensified processing. Generic application examples are given for single and double emulsions and for microcapsules with different structural features. © 2013 The Society of Powder Technology Japan. Published by Elsevier B.V.
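
A common rule of thumb in membrane emulsification is that droplet diameter scales roughly linearly with pore diameter (typically by a factor of about 2-10, depending on shear, interfacial tension and membrane wetting). A minimal sketch, with an assumed scale factor and a hypothetical pore-size distribution:

```python
import numpy as np

# Droplet size estimate for membrane emulsification. The linear
# droplet/pore size ratio and the pore-size distribution below are
# illustrative assumptions, not values from the review.

rng = np.random.default_rng(0)
pore_diameter_um = rng.normal(loc=5.0, scale=0.25, size=10_000)  # hypothetical membrane
size_factor = 3.5                                                # assumed droplet/pore ratio
droplets_um = size_factor * pore_diameter_um

mean_d = droplets_um.mean()
cv_percent = 100 * droplets_um.std() / mean_d  # coefficient of variation
print(f"mean droplet size: {mean_d:.2f} um, CV: {cv_percent:.1f}%")
```

Under this linear model the coefficient of variation carries over directly from pores to droplets, which is why membranes with narrow pore-size distributions (and, even more so, microfluidic devices) yield narrow droplet size distributions.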

Relevance:

30.00%

Publisher:

Abstract:

Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1 and 2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli, producing recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option for preparing samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration when synthesizing proteins in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations on the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction in the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins, and even in ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular toolbox, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, coaxing challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell, with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Relevance:

30.00%

Publisher:

Abstract:

Technology discloses man's mode of dealing with Nature, the process of production by which he sustains his life, and thereby also lays bare the mode of formation of his social relations, and of the mental conceptions that flow from them (Marx, 1990: 372). My thesis is a sociological analysis of UK policy discourse for educational technology during the last 15 years. My framework is a dialogue between the Marxist-based critical social theory of Lieras and a corpus-based Critical Discourse Analysis (CDA) of UK policy for Technology Enhanced Learning (TEL) in higher education. Embedded in TEL is a presupposition: a deterministic assumption that technology has enhanced learning. This conceals a necessary debate that reminds us it is humans that design learning, not technology. By omitting people, TEL provides a vehicle for strong hierarchical or neoliberal agendas to make simplified claims politically, in the name of technology. My research has two main aims. Firstly, I share a replicable, mixed-method approach for linguistic analysis of the political discourse of TEL. Quantitatively, I examine patterns in my corpus to question forms of ‘use’ around technology that structure a rigid basic argument which ‘enframes’ educational technology (Heidegger, 1977: 38). In a qualitative analysis of the findings, I ask to what extent policy discourse evaluates technology in one way, to support a Knowledge Based Economy (KBE) in a political economy of neoliberalism (Jessop, 2004; Fairclough, 2006). If technology is commodified as an external enhancement, it is expected to provide an ‘exchange value’ for learners (Marx, 1867). I therefore examine more closely what is prioritised and devalued in these texts. Secondly, I disclose a form of austerity in the discourse, where technology, as an abstract force, undertakes tasks usually ascribed to humans (Lieras, 1996; Brey, 2003: 2). This risks desubjectivisation and loss of power, and limits people's relationships with technology and with each other. A view of technology in political discourse as complete without people closes off possibilities for broader dialectical (Fairclough, 2001, 2007) and ‘convivial’ (Illich, 1973) understandings of the intimate, material practice of engaging with technology in education. In opening the ‘black box’ of TEL via CDA, I reveal talking points that are otherwise concealed. This allows me to be reflexive and self-critical through praxis, to confront my own assumptions about what the discourse conceals and what forms of resistance might be required. In so doing, I contribute to ongoing debates about networked learning, providing a context in which to explore educational technology as a technology, language and learning nexus.

Relevance:

30.00%

Publisher:

Abstract:

This chapter examines the fast pyrolysis of biomass to produce liquids for use as fuels and chemicals. The technology of fast pyrolysis is described, together with the characteristics of the main product, bio-oil. This primary liquid has many properties that affect its use, and these have prompted increasingly extensive research into modifying those that require it; this work is reviewed in terms of physical, catalytic and chemical upgrading. Of particular note is the increasing diversity of upgrading methods. © 2013 Woodhead Publishing Limited. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Radio frequency identification (RFID) technology has gained increasing popularity in business as a means to improve operational efficiency and maximise cost savings. However, there is a gap in the literature on exploiting RFID to add substantial value to supply chain operations, especially beyond what RFID vendors can offer. This paper presents a multi-agent system, incorporating RFID technology, aimed at filling that gap. The system models supply chain activities (in particular, logistics operations) and comprises autonomous, intelligent agents representing the key entities in the supply chain. With the advanced characteristics of RFID incorporated, the agent system examines how logistics operations (in particular, the distribution network) can be efficiently reconfigured and optimised in response to dynamic changes in the market, in production and at any stage in the supply chain. © 2012 IEEE.
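
As an illustration of how agents in such a system might react to RFID read events, the following is a minimal sketch. The entity names, capacities and rerouting rule are invented for illustration; this is not the system described in the paper.

```python
from dataclasses import dataclass, field

# Toy RFID-driven distribution network: depot agents register pallet
# reads and a coordinator reroutes stock when a depot saturates.

@dataclass
class RfidEvent:
    tag_id: str        # pallet-level RFID tag (e.g. an EPC)
    location: str      # reader location, such as a depot gate
    timestamp: float

@dataclass
class DepotAgent:
    name: str
    capacity: int
    inventory: set = field(default_factory=set)

    def on_read(self, event, network):
        # Register the pallet; if over capacity, ask the coordinator to reroute.
        self.inventory.add(event.tag_id)
        if len(self.inventory) > self.capacity:
            network.reroute(self, event.tag_id)

class NetworkCoordinator:
    """Reconfigures flows when depots saturate (toy decision rule)."""
    def __init__(self, depots):
        self.depots = depots

    def reroute(self, overloaded, tag_id):
        # Send the pallet to the least-loaded depot instead.
        target = min(self.depots, key=lambda d: len(d.inventory) / d.capacity)
        if target is not overloaded:
            overloaded.inventory.discard(tag_id)
            target.inventory.add(tag_id)
            print(f"rerouted {tag_id}: {overloaded.name} -> {target.name}")

depots = [DepotAgent("North", capacity=2), DepotAgent("South", capacity=5)]
net = NetworkCoordinator(depots)
for i in range(4):  # simulated pallet reads at the North depot gate
    depots[0].on_read(RfidEvent(f"EPC-{i:03d}", "North-gate", float(i)), net)
```

The point of the sketch is the event-driven pattern: item-level RFID reads give each agent a live view of its local state, and reconfiguration decisions emerge from agent interaction rather than from a fixed, centrally planned network.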

Relevance:

30.00%

Publisher:

Abstract:

Saturation mutagenesis is a powerful tool in modern protein engineering which permits key residues within a protein to be targeted in order to potentially enhance specific functionalities. However, the creation of large libraries by conventional saturation mutagenesis with degenerate codons (NNN or NNK/S) has inherent redundancy and consequent disparities in codon representation. Both chemical (trinucleotide phosphoramidite) and biological (sequential, enzymatic single-codon addition) methods of non-degenerate saturation mutagenesis have therefore been developed to combat these issues and so improve library quality. Large libraries with multiple saturated positions can be limited by the method used to screen them. Cell-dependent methods such as phage display, although traditionally the screening methods of choice, are limited by the need for transformation. A number of cell-free screening methods, such as CIS display, which link the screened phenotype with the encoded genotype, are capable of screening libraries with up to 10^14 members. This thesis describes the further development of ProxiMAX technology to reduce library codon bias, and its integration with CIS display to screen the resulting library. Synthetic MAX oligonucleotides are ligated to an acceptor base sequence, amplified and digested, thereby adding a randomised codon to the acceptor; this forms an iterative cycle in which the digested product of one cycle serves as the base sequence for the next. Initial use of ProxiMAX highlighted steps in the process where changes could be implemented to improve codon representation in the final library. The refined process was used to construct a monomeric anti-NGF peptide library based on two proprietary dimeric peptides (Isogenica) that bind NGF. The resulting library showed greatly improved codon representation, equating to a theoretical diversity of ~69%. The library was subsequently screened using CIS display, and the discovered peptides were assessed for NGF-TrkA inhibition by ELISA. Despite binding to TrkA, these peptides showed lower levels of inhibition of the NGF-TrkA interaction than the parental dimeric peptides, highlighting the importance of dimerization for inhibition of NGF-TrkA binding.
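
The codon bias that ProxiMAX addresses is easy to quantify. The sketch below counts how many NNK codons encode each residue under the standard genetic code; the framing of the comparison is ours, but the genetic code table itself is standard.

```python
from collections import Counter
from itertools import product

# Codon redundancy of NNK saturation mutagenesis versus a non-degenerate
# (one-codon-per-residue) scheme such as ProxiMAX. '*' denotes stop.

BASES = "TCAG"
AAS = ("FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRR"
       "IIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG")  # NCBI table 1, TCAG codon order
CODE = {a + b + c: AAS[16 * BASES.index(a) + 4 * BASES.index(b) + BASES.index(c)]
        for a, b, c in product(BASES, repeat=3)}

# NNK: N = A/C/G/T at positions 1-2, K = G/T at position 3 -> 32 codons.
nnk = [a + b + c for a in "ACGT" for b in "ACGT" for c in "GT"]
counts = Counter(CODE[codon] for codon in nnk)

print(f"{len(nnk)} NNK codons cover {len(counts) - ('*' in counts)} amino acids")
for aa, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"  {aa}: {n} codon(s) -> expected frequency {n / len(nnk):.3f}")
# A non-degenerate scheme assigns exactly one codon per desired residue,
# so every residue is sampled uniformly and no stop codons appear.
```

Running this shows the familiar picture: 32 NNK codons cover all 20 amino acids plus one stop (TAG), with Leu, Arg and Ser encoded three times as often as the twelve single-codon residues, which is precisely the disparity in representation that non-degenerate methods remove.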

Relevance:

30.00%

Publisher:

Abstract:

The study summarizes the results of a questionnaire survey carried out by the Department of Logistics and Supply Chain Management at Corvinus University of Budapest (BCE). The fundamental aim of the research was to assess and describe the current level of IT support for the logistics, and especially the distribution logistics, processes of Hungarian companies, and the expected directions of development in this field over the next two to three years. The survey systematically covered every subsystem of the logistics information system, examining the prevalence of different identification solutions, established practice in the use of ERP systems and their individual modules, as well as the IT support of strategic logistics decisions and the communication techniques in use. Overall, the level of development of logistics information systems in Hungary today can be described as moderate; it is important to note, however, that the SME sector lags considerably behind in this area as well. This naturally also means that considerable performance improvements could still be achieved by extending the use of IT tools.

Relevance:

30.00%

Publisher:

Abstract:

Generalizing from experience gained in solving practical problems, Koopmans set about developing the linear activity analysis model. To his surprise, he found that the economics of his day had no unified, sufficiently exact theory of production or system of concepts for it. In his pioneering paper he therefore also laid down, as a theoretical framework for the linear activity analysis model, the foundations of an axiomatic production theory resting on the concept of technology sets. He is credited with the exact definition of productive efficiency and efficiency prices, and with proving their mutually presupposing relationship within the linear activity analysis model. Koopmans treated the purely technical definition of efficiency in use today only as a special case; his aim was to introduce and analyse the concept of economic efficiency. In this paper we reconstruct his results on the latter using the duality theorems of linear programming. We show, first, that his proofs are equivalent to proving the duality theorems of linear programming and, second, that economic efficiency prices are in fact shadow prices in the modern sense. We also point out that his model for interpreting economic efficiency can be regarded as the direct predecessor of the Arrow-Debreu-McKenzie models of general equilibrium theory, containing almost all of their essential elements and concepts: equilibrium prices are nothing other than Koopmans's efficiency prices. Finally, we reinterpret Koopmans's model as a possible tool for the microeconomic description of the firm's technology. Journal of Economic Literature (JEL) codes: B23, B41, C61, D20, D50.
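
For orientation, the duality machinery used in this reconstruction is the standard primal-dual pair of linear programming (textbook material, stated here rather than quoted from the paper):

$$\max_{x} \{\, c^{\top}x : Ax \le b,\ x \ge 0 \,\} \;=\; \min_{y} \{\, b^{\top}y : A^{\top}y \ge c,\ y \ge 0 \,\}$$

Strong duality equates the two optimal values, and the optimal dual solution $y^{*}$ consists of the shadow prices, $y_i^{*} = \partial z^{*}/\partial b_i$: in Koopmans's framework, these are exactly the efficiency prices that support an efficient activity vector.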

Relevance:

30.00%

Publisher:

Abstract:

The study summarizes the results of a questionnaire survey carried out by the Department of Logistics and Supply Chain Management at Corvinus University of Budapest (BCE). The fundamental aim of the research was to assess and describe the current level of IT support for the logistics, and especially the distribution logistics, processes of Hungarian companies, and the plans to develop it over the next two to three years. The survey systematically covered every subsystem of the logistics information system, examining the prevalence of different identification solutions, established practice in the use of ERP systems and their individual modules, as well as the IT support of strategic logistics decisions and the communication techniques in use. Overall, the level of development of logistics information systems in Hungary today is moderate; it is important to note, however, that the SME sector has fallen significantly behind in this area as well. This naturally also means that serious performance improvements could still be achieved by extending the use of IT tools.

Relevance:

30.00%

Publisher:

Abstract:

Using panel data, this article analyses the technical efficiency of Hungarian crop farms between 2001 and 2009. To estimate the level of technical efficiency we use, alongside a conventional stochastic frontier analysis (SFA) model, a latent class model (LCM) that also accounts for technological differences. Our results suggest that technological heterogeneity can be important even in a sector such as arable crop production, where relatively homogeneous technology is employed. A comparison of the conventional model, which assumes a technology common to all farms, with the latent class model shows that traditional SFA models may underestimate the technical efficiency of crop farms.
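
For orientation, the conventional stochastic frontier specification behind such estimates is (textbook form; the half-normal inefficiency distribution is assumed here for concreteness):

$$\ln y_{it} = x_{it}^{\top}\beta + v_{it} - u_{it}, \qquad v_{it} \sim N(0,\sigma_v^2), \quad u_{it} \sim N^{+}(0,\sigma_u^2),$$

with technical efficiency $TE_{it} = \exp(-u_{it}) \in (0,1]$. The latent class model replaces the single frontier with a finite mixture, $L_i = \sum_{j=1}^{J} \pi_j\, L_i(\beta_j, \sigma_j)$, assigning each farm probabilistically to one of $J$ technology classes; forcing heterogeneous farms onto a single frontier misreads technology differences as inefficiency, which is why the standard model can understate efficiency.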

Relevance:

30.00%

Publisher:

Abstract:

Security remains a top priority for organizations as their information systems continue to be plagued by security breaches. This dissertation developed a unique approach to assessing the security risks associated with information systems, based on a dynamic neural network architecture. The risks considered encompass the production computing environment and the client machine environment, and are established as metrics that define how susceptible each computing environment is to security breaches.

The merit of the approach lies in the design and implementation of artificial neural networks to assess the risks in the computing and client machine environments. The datasets used in the implementation and validation of the model were obtained from business organizations through a web survey tool hosted by Microsoft. This site was designed to host anonymous surveys devised specifically as part of this dissertation; Microsoft customers could log in to the website and submit their responses to the questionnaire.

This work asserted that security in information systems depends not on technology alone but on the triumvirate of people, process and technology. The questionnaire, and consequently the neural network architecture, accounted for all three key factors that affect information systems security.

As part of the study, a methodology for developing, training and validating such a predictive model was devised and successfully deployed. This methodology prescribed how to determine the optimal topology, activation function and associated parameters for this security-based scenario. The assessment of the effects of security breaches on information systems has traditionally been post-mortem, whereas this dissertation provides a predictive solution with which organizations can determine proactively how susceptible their environments are to security breaches.
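
A minimal sketch of a feed-forward scoring network of the general kind described, trained on synthetic people/process/technology features. The topology, features and data below are hypothetical; the dissertation's actual network, survey items and datasets are not reproduced here.

```python
import numpy as np

# Toy network scoring breach susceptibility from three survey-derived
# features (people, process, technology). Synthetic data, illustrative
# topology; not the dissertation's model.

rng = np.random.default_rng(42)
X = rng.uniform(0, 1, (200, 3))  # columns: people, process, technology scores
y = (X @ np.array([0.5, 0.3, 0.2]) + 0.1 * rng.standard_normal(200) > 0.5).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units, trained by plain gradient descent on log-loss.
W1 = rng.standard_normal((3, 8)) * 0.5
b1 = np.zeros(8)
W2 = rng.standard_normal(8) * 0.5
b2 = 0.0
lr = 0.5

for _ in range(2000):
    h = sigmoid(X @ W1 + b1)     # hidden activations
    p = sigmoid(h @ W2 + b2)     # predicted breach susceptibility in [0, 1]
    grad_out = (p - y) / len(y)  # dL/dz for log-loss with sigmoid output
    W2 -= lr * h.T @ grad_out
    b2 -= lr * grad_out.sum()
    grad_h = np.outer(grad_out, W2) * h * (1 - h)  # backprop to hidden layer
    W1 -= lr * X.T @ grad_h
    b1 -= lr * grad_h.sum(axis=0)

print("mean predicted risk:", round(float(p.mean()), 3))
```

The predictive, rather than post-mortem, character of the approach comes from exactly this structure: once trained on survey responses, the network scores a new environment's susceptibility before any breach occurs.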

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this thesis was to build the Guitar Application ToolKit (GATK), a series of applications used to expand the sonic capabilities of the acoustic/electric stereo guitar. A further goal of the GATK was to extend the improvisational capabilities and compositional techniques generated by this innovative instrument.

During the creation of the GATK, current production guitar techniques and the overall sonic result were enhanced by planning and implementing a personalized electro-acoustic performance setup, designing custom-made performance interfaces, creating interactive compositional strategies, crafting non-standardized sounds, and controlling various musical parameters in real time using the Max/MSP programming environment.

This was the first thesis project of its kind. It is expected that this thesis will be useful as a reference paper for electronic musicians and music technology students; as a product demonstration for companies that manufacture the relevant software; and as a personal portfolio for future technology-related jobs.