757 results for prerequisite
Abstract:
There is currently significant interest in particle-stabilized emulsions for a variety of applications and as precursors to other materials such as micro-capsules or colloidosomes. A prerequisite for many applications is the ability to produce stable droplets with a well-controlled size. The preparation of oil-in-water (o/w) emulsions stabilized by silica colloids has been demonstrated here using membrane emulsification techniques. Emulsions were produced using both a cross-flow membrane device and a rotating membrane reactor. Under the correct conditions, highly stable emulsions with very narrow droplet size distributions can be produced. Investigations into the effects of changing the cross-flow shear rate at a fixed droplet production rate illustrate the fine control over mean droplet size that is possible with these emulsification techniques. Evidence for the importance of particle adsorption kinetics onto growing droplets prior to detachment from the membrane surface was obtained by varying the droplet production rate under fixed shear conditions. The presence of a critical surface coverage by the stabilizing particles to prevent droplet coalescence was clearly seen. Comparison with samples produced using conventional high-shear homogenization highlights the improved control over size distribution available from these membrane techniques.
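The shear-rate dependence of droplet size described above can be illustrated with a simple force-balance model often used in membrane-emulsification analysis. This is a minimal sketch under stated assumptions, not the authors' analysis; all numerical values below are hypothetical.

```python
# Illustrative force-balance estimate of droplet size in cross-flow
# membrane emulsification (a common simplified model, not the authors'
# analysis). A droplet detaches when the wall-shear drag on it exceeds
# the capillary force anchoring it to the membrane pore:
#   F_drag ~ 3 * pi * mu * shear_rate * d**2   (Stokes-type drag near a wall)
#   F_cap  ~ pi * d_pore * sigma               (interfacial tension at the pore)
# Equating the two gives d ~ sqrt(d_pore * sigma / (3 * mu * shear_rate)),
# so raising the shear rate shrinks the detached droplet.

import math

def detachment_diameter(d_pore_m, sigma_N_per_m, mu_Pa_s, shear_rate_per_s):
    """Estimate the droplet diameter (m) at detachment from a membrane pore."""
    return math.sqrt(d_pore_m * sigma_N_per_m / (3.0 * mu_Pa_s * shear_rate_per_s))

# Hypothetical values: 1 um pore, 10 mN/m interfacial tension,
# water-like continuous phase (1 mPa.s), shear rate 5000 1/s.
d = detachment_diameter(1e-6, 10e-3, 1e-3, 5000.0)
print(f"Estimated droplet diameter: {d * 1e6:.1f} um")
```

The inverse square-root dependence on shear rate in this toy model mirrors the fine control over mean droplet size reported in the abstract.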
Abstract:
Nickel and cobalt are both known to stimulate the hypoxia-inducible factor-1 (HIF-1α), thus significantly improving blood vessel formation in tissue engineering applications. We have manufactured nickel- and cobalt-doped bioactive glasses to act as a controlled delivery mechanism for these ions. The resultant structural consequences have been investigated using the methods of isotopic and isomorphic substitution applied to neutron diffraction. The structural sites present will be intimately related to their release properties in physiological fluids such as plasma and saliva, and hence to the bioactivity of the material. Detailed structural knowledge is therefore a prerequisite for optimising material design. Results show that nickel and cobalt adopt a mixed structural role within these bioactive glasses, occupying both network-forming (tetrahedral) and network-modifying (5-fold) geometries. Two-thirds of the Ni (or Co) occupies a five-fold geometry, with the remaining third in a tetrahedral environment. A direct comparison of the primary structural correlations (e.g. Si-O, Ca-O, Na-O and O-Si-O) between the archetypal 45S5 Bioglass® and the Ni and Co glasses studied here reveals no significant differences. This indicates that the addition of Ni (or Co) will have no adverse effects on the existing structure, and thus on the in vitro/in vivo dissolution rates and therefore the bioactivity of these glasses.
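As a quick consistency check, the reported site populations imply a mean Ni (or Co) coordination number, computed directly from the two-thirds/one-third split stated above:

```latex
\bar{n}_{\mathrm{Ni-O}} \;=\; \tfrac{2}{3}\times 5 \;+\; \tfrac{1}{3}\times 4 \;=\; \tfrac{14}{3} \;\approx\; 4.67
```

A mean coordination number between 4 and 5 is exactly what a diffraction experiment averaging over both environments would report.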
Abstract:
A view has emerged within manufacturing and service organizations that the operations management function can hold the key to achieving competitive edge. This has recently been emphasized by the demands for greater variety and higher quality, which must be set against a background of increasing cost of resources. As nations' trade barriers are progressively lowered and removed, producers of goods and service products are becoming more exposed to competition that may come from virtually anywhere in the world. Simply to survive in this climate, many organizations have found it necessary to improve their manufacturing or service delivery systems. To become real "winners", some have adopted a strategic approach to operations and completely reviewed and restructured their approach to production system design and operations planning and control. The articles in this issue of the International Journal of Operations & Production Management have been selected to illustrate current thinking and practice in relation to this situation. They are all based on papers presented to the Sixth International Conference of the Operations Management Association-UK, which was held at Aston University in June 1991. The theme of the conference was "Achieving Competitive Edge", and authors from 15 countries around the world contributed to more than 80 presented papers. Within this special issue five topic areas are addressed, with two articles relating to each. The topics are: strategic management of operations; managing change; production system design; production control; and service operations. Under strategic management of operations, De Toni, Filippini and Forza propose a conceptual model which considers the performance of an operating system as a source of competitive advantage through the "operation value chain" of design, purchasing, production and distribution. Their model is set within the context of the tendency towards globalization. New's article stands somewhat in contrast to the more fashionable literature on operations strategy. It challenges the validity of the current idea of "world-class manufacturing" and, instead, urges a reconsideration of the view that strategic "trade-offs" are necessary to achieve a competitive edge. The importance of managing change has for some time been recognized within the field of organization studies, but its relevance to operations management is only now being realized. Berger considers the use of "organization design", "sociotechnical systems" and change strategies, and contrasts these with the more recent idea of the "dialogue perspective". A tentative model is suggested to improve the analysis of different strategies in a situation-specific context. Neely and Wilson look at an essential prerequisite if change is to be effected in an efficient way, namely product goal congruence. Using a case study as its basis, their article suggests a method of measuring goal congruence as a means of identifying the extent to which key performance criteria relating to quality, time, cost and flexibility are understood within an organization. The two articles on production system design represent important contributions to the debate on flexible production organization and autonomous group working. Rosander uses the results from cases to test the applicability of "flow groups" as the optimal way of organizing batch production. Schuring also examines cases to determine the reasons behind the adoption of "autonomous work groups" in The Netherlands and Sweden.
Both these contributions help to provide a greater understanding of the production philosophies which have emerged as alternatives to more conventional systems for intermittent and continuous production. The production control articles are both concerned with the concepts of "push" and "pull", the two broad approaches to material planning and control. Hirakawa, Hoshino and Katayama have developed a hybrid model, suitable for multistage manufacturing processes, which combines the benefits of both systems. They discuss the theoretical arguments in support of the system and illustrate its performance with numerical studies. Slack and Correa's concern is with the flexibility characteristics of push and pull material planning and control systems. They use the case of two plants using the different systems to compare their performance across a number of predefined flexibility types. The two final contributions, on service operations, are complementary. The article by Voss really relates to manufacturing but examines the application of service industry concepts within the UK manufacturing sector. His studies in a number of companies support the idea of the "service factory" and offer a new perspective for manufacturing. Harvey's contribution, by contrast, is concerned with the application of operations management principles in the delivery of professional services. Using the case of social-service provision in Canada, it demonstrates how concepts such as "just-in-time" can be used to improve service performance. The ten articles in this special issue of the journal address a wide range of issues and situations. Their common aspect is that, together, they demonstrate the extent to which competitiveness can be improved via the application of operations management concepts and techniques.
Abstract:
The reactivity of Amberlite (IRA-67) base "heterogeneous" resin in the Sonogashira cross-coupling of 8-bromoguanosine 1 with phenylacetylene 3 to give 2 has been examined. Both 1 and 2 coordinate to Pd and Cu ions, which explains why, at equivalent catalyst loadings, the homogeneous reaction employing triethylamine base is low yielding. X-ray photoelectron spectroscopy (XPS) has been used to probe and quantify the active nitrogen base sites of the Amberlite resin, and the post-reaction Pd and Cu species. The PdCl2(PPh3)2 precatalyst and CuI cocatalyst degrade to give Amberlite-supported metal nanoparticles (average size ∼2.7 nm). The guanosine product 2 formed using the Amberlite Pd/Cu catalyst system is of higher purity than that from reactions using a homogeneous Pd precatalyst, a prerequisite for use in biological applications. Copyright © Taylor and Francis Group, LLC.
Abstract:
The accurate in silico identification of T-cell epitopes is a critical step in the development of peptide-based vaccines, reagents, and diagnostics, and it has a direct impact on the success of subsequent experimental work. Epitopes arise as a consequence of complex proteolytic processing within the cell. Prior to being recognized by T cells, an epitope is presented on the cell surface as a complex with a major histocompatibility complex (MHC) protein. A prerequisite for T-cell recognition, therefore, is that an epitope is also a good MHC binder; thus, T-cell epitope prediction overlaps strongly with the prediction of MHC binding. In the present study, we compare discriminant analysis and multiple linear regression as algorithmic engines for the definition of quantitative matrices for binding affinity prediction. We apply these methods to peptides which bind the well-studied human MHC allele HLA-A*0201. A matrix combining the results of the two methods proved powerfully predictive under cross-validation. The new matrix was also tested on an external set of 160 binders to HLA-A*0201; it was able to recognize 135 (84%) of them.
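A quantitative matrix of the kind described scores a peptide by summing independent position-wise contributions. The following is a minimal sketch of that scoring scheme; the matrix values and the peptide are hypothetical, not the coefficients derived in the study.

```python
# Minimal sketch of additive quantitative-matrix scoring for a 9-mer
# peptide. Each position contributes an independent term, and the
# predicted binding affinity is the sum. The weights below are made up
# for illustration (L at P2 and V at P9 are well-known HLA-A*0201
# anchor positions, but these numbers are hypothetical).

AAS = "ACDEFGHIKLMNPQRSTVWY"

# matrix[position][amino_acid] -> contribution to the predicted affinity
matrix = [{aa: 0.0 for aa in AAS} for _ in range(9)]
matrix[1]["L"] = 1.2   # hypothetical anchor weight at P2
matrix[8]["V"] = 1.0   # hypothetical anchor weight at P9

def predict_affinity(peptide, matrix):
    """Sum the position-wise contributions of an additive scoring matrix."""
    assert len(peptide) == len(matrix), "peptide length must match matrix"
    return sum(matrix[i][aa] for i, aa in enumerate(peptide))

print(predict_affinity("GLACERTNV", matrix))  # toy 9-mer, prints 2.2
```

Combining two such matrices, one from discriminant analysis and one from multiple linear regression, amounts to merging their position-wise coefficients before scoring.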
Abstract:
The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes is dependent on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide-MHC binding affinity. The ISC-PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide-MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications in both the accurate prediction of class II epitopes and the manipulation of affinity for heteroclitic and competitor peptides. The method is applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek) and included peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictivity. Once the set of selected peptide subsequences had converged, the final models exhibited satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistical terms (q², SEP, and NC) ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistical terms r² and SEE ranged between 0.98 and 0.995 and between 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
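The iterative self-consistent idea can be sketched compactly: for each variable-length peptide, score every 9-mer subsequence with the current additive model, keep the best-scoring one as the presumed binding core, refit, and repeat until the selection stops changing. The sketch below uses ordinary least squares in place of PLS to stay dependency-free (a deliberate substitution, plainly not the authors' PLS engine), and the peptides and affinities are toy data.

```python
# Sketch of the iterative self-consistent (ISC) loop for class II
# peptides of mixed length. OLS stands in for PLS; toy data only.

import numpy as np

AAS = "ACDEFGHIKLMNPQRSTVWY"
CORE = 9

def encode(nine_mer):
    """One-hot encoding: 9 positions x 20 amino acids."""
    x = np.zeros(CORE * len(AAS))
    for i, aa in enumerate(nine_mer):
        x[i * len(AAS) + AAS.index(aa)] = 1.0
    return x

def isc_fit(peptides, affinities, max_iter=20):
    # Start from the N-terminal 9-mer of every peptide.
    cores = [p[:CORE] for p in peptides]
    for _ in range(max_iter):
        X = np.array([encode(c) for c in cores])
        w, *_ = np.linalg.lstsq(X, np.array(affinities), rcond=None)
        # Re-select the best-scoring 9-mer core of each peptide.
        new_cores = [
            max((p[i:i + CORE] for i in range(len(p) - CORE + 1)),
                key=lambda c: float(encode(c) @ w))
            for p in peptides
        ]
        if new_cores == cores:   # self-consistency reached
            break
        cores = new_cores
    return w, cores

peptides = ["ACDEFGHIKLMNP", "KLMNPQRSTVWYA", "GHIKLMNPQRSTV"]
w, cores = isc_fit(peptides, [6.1, 7.3, 5.2])
print(cores)
```

Convergence of the selected cores is exactly the stopping criterion the abstract reports as being reached between the 4th and 17th iterations.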
Abstract:
Purpose – The purpose of this paper is to explore the importance of host country networks and the organisation of production in the context of the international technology transfer that accompanies foreign direct investment (FDI). Design/methodology/approach – The empirical analysis is based on unbalanced panel data covering Japanese firms active in two-digit manufacturing sectors over a seven-year period. Given the self-selection problem affecting past sectoral-level studies, using firm-level panel data is a prerequisite for providing robust empirical evidence. Findings – While Japan is thought of as a technologically advanced country, the results show that vertical productivity spillovers from FDI occur in Japan, but they are sensitive to technological differences between domestic firms and the idiosyncratic Japanese institutional network. FDI in vertically organised keiretsu sectors generates inter-industry spillovers through backward and forward linkages, while FDI within sectors linked to vertical keiretsu activities adversely affects domestic productivity. Overall, our results suggest that the role of vertical keiretsu is more prevalent than that of horizontal keiretsu. Originality/value – Japan's industrial landscape has been dominated by institutional clusters or networks of inter-firm organisations built on reciprocated, direct and indirect ties. However, interactions between inward investors and such institutionalised networks in the host economy are seldom explored. The role and characteristics of local business groups, in the form of keiretsu networks, have been investigated to determine the scale and scope of spillovers from inward FDI to Japanese establishments. This conceptualisation depends on the institutional mechanism and the market structure through which host economies absorb and exploit FDI.
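A typical firm-level specification in this spillover literature takes the following form (shown for orientation only; the abstract does not state the authors' exact model):

```latex
\ln \mathit{TFP}_{it} \;=\; \alpha_i + \lambda_t
  + \beta_1\,\mathit{Horizontal}_{jt}
  + \beta_2\,\mathit{Backward}_{jt}
  + \beta_3\,\mathit{Forward}_{jt}
  + \gamma' X_{it} + \varepsilon_{it}
```

Here firm \(i\) in sector \(j\) at time \(t\) has productivity regressed on measures of foreign presence in its own sector (horizontal) and in downstream and upstream sectors (backward and forward linkages), with firm and time fixed effects; the signs and significance of \(\beta_2\) and \(\beta_3\) identify the vertical spillovers discussed above.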
Abstract:
Compact, visual documentation of information business modeling is a major prerequisite for comprehending its basic concepts, as well as for its effective application and improvement. Documenting this process is closely related to modeling it; thus, the process of information business modeling can be represented by its own tools. Based on this thesis, the authors suggest an approach to representing the process of information business modeling, and a profile for documenting it has been developed for this purpose.
Abstract:
We report observations and measurements of the inscription of fiber Bragg gratings (FBGs) in two different types of microstructured polymer optical fiber: a few-mode fiber and an endlessly single-mode fiber. In contrast to FBG inscription in silica microstructured fiber, where high-energy laser pulses are a prerequisite, we have successfully used a low-power cw laser source operating at 325 nm to produce 1 cm long gratings with a reflection peak at 1570 nm. Peak reflectivities of more than 10% have been observed. © 2005 Optical Society of America.
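The reported reflection peak is consistent with the first-order Bragg condition. Assuming a typical effective index of about 1.49 for a PMMA-based polymer fiber (an assumption; the abstract does not give the index), the implied grating pitch is:

```latex
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda
\quad\Rightarrow\quad
\Lambda = \frac{1570\ \mathrm{nm}}{2 \times 1.49} \approx 527\ \mathrm{nm}
```
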
Abstract:
Motivation: In molecular biology, molecular events describe observable alterations of biomolecules, such as the binding of proteins or the production of RNA. These events may be responsible for drug reactions or the development of certain diseases. Biomedical event extraction, the process of automatically detecting descriptions of molecular interactions in research articles, has therefore attracted substantial research interest recently. Event trigger identification, detecting the words that describe the event types, is a crucial prerequisite step in the pipeline of biomedical event extraction. Taking the event types as classes, event trigger identification can be viewed as a classification task: for each word in a sentence, a trained classifier predicts, from context features, whether the word corresponds to an event type and, if so, which one. A well-designed feature set with a good level of discrimination and generalization is therefore crucial for the performance of event trigger identification. Results: In this article, we propose a novel framework for event trigger identification. In particular, we learn biomedical domain knowledge from a large text corpus built from Medline and embed it into word features using neural language modeling. The embedded features are then combined with syntactic and semantic context features using the multiple kernel learning method, and the combined feature set is used to train the event trigger classifier. Experimental results on the gold-standard corpus show that the proposed framework achieves an improvement of more than 2.5% in F-score over the state-of-the-art approach, demonstrating its effectiveness. © The Author 2014. The source code for the proposed framework is freely available and can be downloaded at http://cse.seu.edu.cn/people/zhoudeyu/ETI_Sourcecode.zip.
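The kernel-combination step can be sketched as follows. This is a simplified stand-in for full multiple kernel learning: the kernel weights below are fixed rather than learned, and all data is random and purely illustrative, not the authors' features or corpus.

```python
# Combining two feature views (word embeddings and context features)
# via a weighted sum of kernels, then training an SVM on the combined
# precomputed kernel. Fixed weights approximate MKL, in which the
# weights themselves would be optimised.

import numpy as np
from sklearn.metrics.pairwise import linear_kernel, rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
emb = rng.normal(size=(n, 50))   # stand-in for neural word embeddings
ctx = rng.normal(size=(n, 30))   # stand-in for syntactic/semantic context features
y = (emb[:, 0] + ctx[:, 0] > 0).astype(int)  # toy trigger/non-trigger labels

# One kernel per feature view, combined with fixed weights.
K = 0.6 * linear_kernel(emb) + 0.4 * rbf_kernel(ctx, gamma=0.02)

clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```

Because any convex combination of valid kernels is itself a valid kernel, the combined matrix can be handed to a standard SVM, which is what makes this style of feature fusion attractive.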
Abstract:
Cellular peptide vaccines contain T-cell epitopes. The main prerequisite for a peptide to act as a T-cell epitope is that it binds to a major histocompatibility complex (MHC) protein. Identifying peptide MHC binders experimentally is extremely costly, since human MHCs, known as human leukocyte antigens (HLAs), are highly polymorphic and polygenic. Here we present EpiDOCK, the first structure-based server for MHC class II binding prediction. EpiDOCK predicts binding to the 23 most frequent human MHC class II proteins. It identifies 90% of true binders and 76% of true non-binders, with an overall accuracy of 83%. EpiDOCK is freely accessible at http://epidock.ddg-pharmfac.net. © The Author 2013. Published by Oxford University Press. All rights reserved.
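The three reported figures are mutually consistent if binders and non-binders occur in roughly equal numbers in the evaluation set (an inference from the numbers, not a statement in the abstract): with sensitivity 0.90 and specificity 0.76,

```latex
\mathrm{accuracy} \;=\; \frac{0.90\,P + 0.76\,N}{P + N} \;\approx\; 0.83
\quad \text{when } P \approx N .
```
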
Abstract:
Agile methodologies are becoming more popular in the software development process nowadays. The iterative development lifecycle, openness to frequent changes, and tight cooperation with the client and among the software engineers are proving to be increasingly effective practices that respond better to current business needs. It is natural to ask which methodology is most suitable for use when starting and managing a project. This depends on many factors: product characteristics, technologies used, the client's and developers' experience, and project type. A systematic analysis of the most common problems that appear when developing a particular type of project, public portal solutions, is proposed. In this case a very close interaction with various types of end users is observed. This is a prerequisite for permanent changes during the development and support cycles, which makes such projects ideal candidates for an agile methodology. We compare the ways in which each methodology addresses the specific problems that arise and finish by ranking the methodologies according to their relevance. This may help the project manager in choosing one methodology or a combination of them.
Abstract:
With the widespread use of the Internet, the need for rapid development of electronic communication networks has prompted government policy makers to formulate and implement development policy. The advancement of the (information) society and the use of the infocommunication services underlying it depend fundamentally on the development of broadband infrastructure and on the availability of access to the electronic communication network. The government's propensity to play a bigger role in the field of electronic communication has increased significantly since 2011. The establishment of MVM NET Zrt. (Hungarian Electricity NET Ltd.), the reorganisation of NISZ Zrt. (National Infocommunication Services Company Ltd.), the GOP 3.1.2 tender, and the plan to enable a fourth mobile network operator to enter the market all indicate the government's strong intention to develop this field. The study presents the intervention tools available to the state for stimulating the development of the electronic communication network. The author then analyses the four major government interventions to examine whether the decisions on the state's role were adequately well founded.
Abstract:
The National Council Licensure Examination for Registered Nurses (NCLEX-RN) is the examination that all graduates of nursing education programs must pass to attain the title of registered nurse. Currently the NCLEX-RN passing rate is at an all-time low (81%) for first-time test takers (NCSBN, 2004), amid a nationwide shortage of registered nurses (Glabman, 2001). Because of the critical need to supply greater numbers of professional nurses, and the potential accreditation ramifications that low NCLEX-RN passing rates can have on schools of nursing and graduates, this research study tests the effectiveness of a predictor model. This model is based upon the theoretical framework of McClusky's (1959) theory of margin (ToM), with the hope that students found to be at risk of NCLEX-RN failure can be identified and remediated prior to taking the actual licensure examination. To date no theory-based predictor model has been identified that predicts success on the NCLEX-RN. The model was tested using prerequisite course grades, nursing course grades and scores on standardized examinations for the 2003 associate degree nursing graduates at an urban community college (N = 235). Success was determined through the reporting of a pass on the NCLEX-RN examination by the Florida Board of Nursing. Point-biserial correlations tested model assumptions regarding variable relationships, while logistic regression was used to test the model's predictive power. Correlations among variables were significant, and the model accounted for 66% of the variance in graduates' success on the NCLEX-RN with 98% prediction accuracy. Although certain prerequisite course grades and nursing course grades were found to be significant to NCLEX-RN success, the overall model was found to be most predictive at the conclusion of the academic program of study. The inclusion of the RN Assessment Examination, taken during the final semester of course work, was the most significant predictor of NCLEX-RN success. Success on the NCLEX-RN allows graduates to work as registered nurses, reflects positively on a school's academic performance record, and supports the appropriateness of the educational program's goals and objectives. The study's findings support other potential uses of McClusky's theory of margin as a predictor of program outcomes in other venues of adult education.
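The shape of such a predictor model can be sketched as follows. This is a minimal illustration of the statistical approach described, not the study's actual model: the feature set, column values, and outcome rule below are all hypothetical, with the cohort size taken from the abstract.

```python
# Minimal sketch of a logistic-regression pass/fail predictor of the
# kind described above. Data is synthetic; only n = 235 comes from the
# abstract. The toy outcome weights the standardized exam most heavily,
# mirroring the study's finding that the RN Assessment Examination was
# the strongest predictor.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 235  # cohort size reported in the study
# Hypothetical features: prerequisite GPA, nursing-course GPA, exam score.
X = np.column_stack([
    rng.uniform(2.0, 4.0, n),
    rng.uniform(2.0, 4.0, n),
    rng.uniform(50, 100, n),
])
y = (0.05 * X[:, 0] + 0.1 * X[:, 1] + 0.04 * X[:, 2]
     + rng.normal(0, 0.3, n) > 3.3).astype(int)

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_)       # relative predictor weights
print("training accuracy:", model.score(X, y))
```

In practice the fitted coefficients would be examined alongside the point-biserial correlations to confirm which predictors carry the model.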
Abstract:
This work is the first to use patterned soft underlayers in multilevel three-dimensional vertical magnetic data storage systems. The motivation stems from an exponentially growing information stockpile and a corresponding need for more efficient storage devices with higher density. The world information stockpile currently exceeds 150 EB (1 EB = 10^18 bytes), most of which is in analog form. Among the storage technologies (semiconductor, optical and magnetic), magnetic hard disk drives are poised to occupy a big role in personal, network and corporate storage. However, this mode suffers from a limit known as the superparamagnetic limit, which caps achievable areal density due to fundamental quantum mechanical stability requirements. Many viable techniques are being considered to defer superparamagnetism into the hundreds of Gbit/in², such as patterned media, Heat-Assisted Magnetic Recording (HAMR), Self-Organized Magnetic Arrays (SOMA), antiferromagnetically coupled structures (AFC), and perpendicular magnetic recording. Nonetheless, these techniques utilize a single magnetic layer and can thus be viewed as two-dimensional in nature. In this work a novel three-dimensional vertical magnetic recording approach is proposed. This approach utilizes the entire thickness of a magnetic multilayer structure to store information, with potential areal density well into the Tbit/in² regime. There are several possible implementations for 3D magnetic recording, each presenting its own set of requirements, merits and challenges. The issues and considerations pertaining to the development of such systems are examined and analyzed using empirical and numerical analysis techniques. Two novel key approaches are proposed and developed: (1) a patterned soft underlayer (SUL), which allows for enhanced recording of thicker media; (2) a combinatorial approach to 3D media development that facilitates concurrent investigation of various film parameters against a predefined performance metric. A case study is presented using combinatorial overcoats of tantalum and zirconium oxides for corrosion protection in magnetic media. The feasibility of 3D recording is demonstrated, and 3D media development is identified as a key prerequisite. The patterned SUL shows significant enhancement over a conventional "un-patterned" SUL, demonstrating that geometry can be used as a design tool to achieve favorable field distribution where magnetic storage and magnetic phenomena are involved.
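For reference, the superparamagnetic limit mentioned above stems from the requirement that a grain's magnetic anisotropy energy exceed thermal energy by a comfortable margin; a commonly quoted criterion (generic to magnetic recording, not specific to this work) is:

```latex
\frac{K_u V}{k_B T} \;\gtrsim\; 40\text{--}60,
```

where \(K_u\) is the anisotropy energy density, \(V\) the grain volume, and \(k_B T\) the thermal energy. Raising areal density shrinks \(V\) until this inequality fails, which is precisely the constraint that storing information through the thickness of a multilayer, rather than only across the surface, seeks to sidestep.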