711 results for prerequisite


Relevance: 10.00%

Abstract:

The accurate in silico identification of T-cell epitopes is a critical step in the development of peptide-based vaccines, reagents, and diagnostics, and it has a direct impact on the success of subsequent experimental work. Epitopes arise as a consequence of complex proteolytic processing within the cell. Before being recognized by T cells, an epitope is presented on the cell surface as a complex with a major histocompatibility complex (MHC) protein. A prerequisite for T-cell recognition is therefore that an epitope is also a good MHC binder; thus, T-cell epitope prediction overlaps strongly with the prediction of MHC binding. In the present study, we compare discriminant analysis and multiple linear regression as algorithmic engines for defining quantitative matrices for binding-affinity prediction. We apply these methods to peptides that bind the well-studied human MHC allele HLA-A*0201. A matrix combining the results of the two methods proved powerfully predictive under cross-validation. The new matrix was also tested on an external set of 160 binders to HLA-A*0201; it recognized 135 (84%) of them.
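To make the quantitative-matrix idea concrete, the sketch below applies an additive matrix to a peptide: each residue contributes a position-dependent term and the sum is compared against a binding threshold. The matrix values, peptide, and threshold are invented for illustration; they are not the published HLA-A*0201 matrix.

```python
# Minimal sketch of additive quantitative-matrix scoring for a 9-mer
# peptide. Matrix entries, peptide, and threshold are placeholders.
EXAMPLE_MATRIX = {
    0: {"L": 0.8, "A": 0.2, "K": -0.5},
    1: {"L": 1.2, "M": 1.0, "K": -0.9},
    # positions 2..7 omitted for brevity; a full matrix covers all nine
    8: {"V": 1.1, "L": 0.9, "K": -0.7},
}
BINDING_THRESHOLD = 1.5  # invented cut-off separating binders from non-binders

def binding_score(peptide: str, matrix: dict) -> float:
    """Additive model: residues contribute independently by position."""
    return sum(matrix.get(i, {}).get(aa, 0.0) for i, aa in enumerate(peptide))

if __name__ == "__main__":
    peptide = "LMDVTAAVV"  # hypothetical 9-mer
    score = binding_score(peptide, EXAMPLE_MATRIX)
    print(f"{peptide}: score={score:.2f} binder={score > BINDING_THRESHOLD}")
```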

Relevance: 10.00%

Abstract:

The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes depends on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide-MHC binding affinity. The ISC-PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide-MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications both in the accurate prediction of class II epitopes and in the manipulation of affinity for heteroclitic and competitor peptides. It was applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek) and included peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictive power. Once the set of selected peptide subsequences had converged, the final models exhibited satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistics (q2, SEP, and NC) ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistics r2 and SEE ranged between 0.980 and 0.995 and between 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
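For readers unfamiliar with the headline statistic, q2 is the cross-validated analogue of r2: q2 = 1 - PRESS/TSS, where PRESS sums squared leave-one-out prediction errors. A minimal sketch follows, using ordinary least squares as a stand-in for the paper's PLS engine; the iterative self-consistent subsequence selection is not reproduced.

```python
# Sketch of leave-one-out cross-validated q^2 = 1 - PRESS / TSS.
# Least squares stands in for PLS; synthetic data replaces real affinities.
import numpy as np

def loo_q2(X: np.ndarray, y: np.ndarray) -> float:
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)  # fit without sample i
        press += float((y[i] - X[i] @ coef) ** 2)                 # squared LOO error
    tss = float(np.sum((y - y.mean()) ** 2))
    return 1.0 - press / tss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 5))
    y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=40)  # synthetic affinities
    print(f"q2 = {loo_q2(X, y):.3f}")
```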

Relevance: 10.00%

Abstract:

Purpose – The purpose of this paper is to explore the importance of host-country networks and the organisation of production in the context of the international technology transfer that accompanies foreign direct investment (FDI).

Design/methodology/approach – The empirical analysis is based on unbalanced panel data covering Japanese firms active in two-digit manufacturing sectors over a seven-year period. Given the self-selection problem affecting past sectoral-level studies, using firm-level panel data is a prerequisite for providing robust empirical evidence.

Findings – Although Japan is a technologically advanced country, the results show that vertical productivity spillovers from FDI do occur in Japan, but they are sensitive to technological differences between domestic firms and to the idiosyncratic Japanese institutional network. FDI in vertically organised keiretsu sectors generates inter-industry spillovers through backward and forward linkages, while FDI within sectors linked to vertical keiretsu activities adversely affects domestic productivity. Overall, the results suggest that the role of vertical keiretsu is more prevalent than that of horizontal keiretsu.

Originality/value – Japan's industrial landscape has been dominated by institutional clusters, or networks of inter-firm organisations, built through reciprocated, direct, and indirect ties. However, interactions between inward investors and such institutionalised networks in the host economy are seldom explored. The role and characteristics of local business groups, in the form of keiretsu networks, are investigated to determine the scale and scope of spillovers from inward FDI to Japanese establishments. This conceptualisation depends on the institutional mechanism and the market structure through which host economies absorb and exploit FDI.
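As a hedged illustration of the kind of firm-level spillover regression such studies estimate, the sketch below regresses firm productivity on horizontal and vertical (backward/forward linkage) foreign-presence measures with firm and year fixed effects. All variable names and the synthetic data are illustrative assumptions; the paper's exact specification, keiretsu interaction terms, and estimator are not reproduced.

```python
# Sketch of a firm-level fixed-effects spillover regression with
# firm-clustered standard errors. Data and variable names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_firms, n_years = 50, 7
df = pd.DataFrame({
    "firm_id": np.repeat(np.arange(n_firms), n_years),
    "year": np.tile(np.arange(2000, 2000 + n_years), n_firms),
    "horizontal_fdi": rng.uniform(0, 1, n_firms * n_years),
    "backward_fdi": rng.uniform(0, 1, n_firms * n_years),
    "forward_fdi": rng.uniform(0, 1, n_firms * n_years),
})
# Synthetic productivity with a positive backward-linkage effect built in.
df["log_tfp"] = (0.3 * df["backward_fdi"] - 0.1 * df["horizontal_fdi"]
                 + rng.normal(0, 0.2, len(df)))

model = smf.ols(
    "log_tfp ~ horizontal_fdi + backward_fdi + forward_fdi"
    " + C(firm_id) + C(year)",  # firm and year fixed effects
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["firm_id"]})
print(model.summary().tables[1])
```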

Relevance: 10.00%

Abstract:

Compact, visualized documentation of information business modeling is a major prerequisite for comprehending its basic concepts, as well as for its effective application and improvement. Documenting this process is closely related to modeling it; thus, the process of information business modeling can be represented by its own tools. Building on this thesis, the authors suggest an approach to representing the process of information business modeling and develop a profile for documenting it.

Relevance: 10.00%

Abstract:

We report observations and measurements of the inscription of fiber Bragg gratings (FBGs) in two different types of microstructured polymer optical fiber: a few-mode and an endlessly single-mode fiber. In contrast to FBG inscription in silica microstructured fiber, where high-energy laser pulses are a prerequisite, we successfully used a low-power cw laser source operating at 325 nm to produce 1 cm long gratings with a reflection peak at 1570 nm. Peak reflectivities of more than 10% have been observed. © 2005 Optical Society of America.
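For context, the reflection peak of an FBG is set by the Bragg condition. A quick back-of-the-envelope check (assuming a typical effective index of about 1.49 for a PMMA-based polymer fiber, a value not stated in the abstract) gives the grating period implied by a 1570 nm peak:

```latex
% Bragg condition: reflection peak \lambda_B set by effective index
% n_eff and grating period \Lambda.
\[
  \lambda_B = 2\, n_{\mathrm{eff}} \Lambda
  \quad\Longrightarrow\quad
  \Lambda = \frac{\lambda_B}{2\, n_{\mathrm{eff}}}
          \approx \frac{1570\ \text{nm}}{2 \times 1.49}
          \approx 527\ \text{nm} .
\]
```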

Relevance: 10.00%

Abstract:

Motivation: In molecular biology, molecular events describe observable alterations of biomolecules, such as the binding of proteins or the production of RNA. These events may be responsible for drug reactions or the development of certain diseases. As such, biomedical event extraction, the process of automatically detecting descriptions of molecular interactions in research articles, has recently attracted substantial research interest. Event trigger identification, detecting the words that describe the event types, is a crucial prerequisite step in the biomedical event extraction pipeline. Taking the event types as classes, event trigger identification can be viewed as a classification task: for each word in a sentence, a trained classifier predicts, from context features, whether the word corresponds to an event type and, if so, which one. A well-designed feature set with a good level of discrimination and generalization is therefore crucial for the performance of event trigger identification. Results: In this article, we propose a novel framework for event trigger identification. In particular, we learn biomedical domain knowledge from a large text corpus built from Medline and embed it into word features using neural language modeling. The embedded features are then combined with syntactic and semantic context features using the multiple kernel learning method. The combined feature set is used to train the event trigger classifier. Experimental results on the gold standard corpus show that the proposed framework improves F-score by more than 2.5% over the state-of-the-art approach, demonstrating its effectiveness. The source code for the proposed framework is freely available and can be downloaded at http://cse.seu.edu.cn/people/zhoudeyu/ETI_Sourcecode.zip. © The Author 2014.
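A minimal sketch of the kernel-combination step described above: a linear kernel over embedded word features plus an RBF kernel over syntactic/semantic context features, summed and fed to a kernel SVM. The paper learns the kernel weights via multiple kernel learning; the fixed weight and the random feature matrices here are placeholder assumptions.

```python
# Sketch of fixed-weight kernel combination for trigger classification.
import numpy as np
from sklearn.metrics.pairwise import linear_kernel, rbf_kernel
from sklearn.svm import SVC

def combined_kernel(E1, E2, C1, C2, w=0.5):
    """K = w * K_embedding + (1 - w) * K_context."""
    return w * linear_kernel(E1, E2) + (1 - w) * rbf_kernel(C1, C2)

rng = np.random.default_rng(0)
X_embed = rng.normal(size=(200, 50))   # stand-in word-embedding features
X_ctx = rng.normal(size=(200, 30))     # stand-in context features
y = rng.integers(0, 2, 200)            # stand-in trigger labels

K_train = combined_kernel(X_embed, X_embed, X_ctx, X_ctx)
clf = SVC(kernel="precomputed").fit(K_train, y)
print("training accuracy:", clf.score(K_train, y))
```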

Relevance: 10.00%

Abstract:

Cellular peptide vaccines contain T-cell epitopes. The main prerequisite for a peptide to act as a T-cell epitope is that it binds to a major histocompatibility complex (MHC) protein. Identifying peptide MHC binders experimentally is extremely costly, since human MHC proteins, known as human leukocyte antigens (HLAs), are highly polymorphic and polygenic. Here we present EpiDOCK, the first structure-based server for MHC class II binding prediction. EpiDOCK predicts binding to the 23 most frequent human MHC class II proteins. It identifies 90% of true binders and 76% of true non-binders, with an overall accuracy of 83%. EpiDOCK is freely accessible at http://epidock.ddg-pharmfac.net. © The Author 2013. Published by Oxford University Press. All rights reserved.
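The three quoted figures are mutually consistent under a balanced test set, which the abstract does not state but which makes the arithmetic easy to verify:

```python
# Consistency check: with sensitivity 0.90 (true binders found) and
# specificity 0.76 (true non-binders found), overall accuracy is their
# prevalence-weighted average. The balanced class counts are hypothetical.
binders, non_binders = 500, 500
sensitivity, specificity = 0.90, 0.76
true_pos = sensitivity * binders
true_neg = specificity * non_binders
accuracy = (true_pos + true_neg) / (binders + non_binders)
print(f"accuracy = {accuracy:.2f}")  # -> 0.83, matching the quoted 83%
```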

Relevance: 10.00%

Abstract:

Agile methodologies are becoming more popular in today's software development process. The iterative development lifecycle, openness to frequent changes, and tight cooperation with the client and among the software engineers are proving to be increasingly effective practices, and they respond to a greater extent to current business needs. It is natural to ask which methodology is the most suitable when starting and managing a project. This depends on many factors: product characteristics, the technologies used, the client's and the developers' experience, and the project type. We propose a systematic analysis of the most common problems that appear when developing one particular type of project: public portal solutions. In this case, very close interaction with various types of end users is observed, which is a prerequisite for constant changes during the development and support cycles and makes such projects ideal candidates for an agile methodology. We compare the ways in which each methodology addresses the specific problems that arise and conclude by ranking the methodologies according to their relevance. This may help project managers choose one methodology, or a combination of them.

Relevance: 10.00%

Abstract:

With the widespread use of the Internet, the need for rapid development of digital communication networks has prompted government policy makers to conceptualize and implement development policy. The advancement of the (information) society, and the use of information and communication technology as a prerequisite of it, are fundamentally determined by the development of broadband infrastructure and by whether broadband access to the digital telecommunication network is available. The government's propensity to play a bigger role in the field of electronic communication has increased significantly since 2011. The establishment of MVM NET Zrt. (Hungarian Electricity NET Ltd.), the realignment of NISZ Zrt. (National Infocommunication Services Company Limited by Shares), the GOP 3.1.2 tender, and the plan to enable a new, fourth mobile network operator to enter the market all indicate the government's robust intention to develop this field. The study presents the intervention tools available to the state for stimulating the development of the electronic communication network. The author then analyses the four main government interventions to examine whether the decisions on state involvement were adequately well-founded.

Relevance: 10.00%

Abstract:

The National Council Licensure Examination for Registered Nurses (NCLEX-RN) is the examination that all graduates of nursing education programs must pass to attain the title of registered nurse. Currently the NCLEX-RN passing rate is at an all-time low (81%) for first-time test takers (NCSBN, 2004), amidst a nationwide shortage of registered nurses (Glabman, 2001). Because of the critical need to supply greater numbers of professional nurses, and the potential accreditation ramifications that low NCLEX-RN passing rates can have on schools of nursing and graduates, this research study tests the effectiveness of a predictor model based on the theoretical framework of McClusky's (1959) theory of margin (ToM), with the hope that students found to be at risk of NCLEX-RN failure can be identified and remediated before taking the actual licensure examination. To date, no theory-based predictor model has been identified that predicts success on the NCLEX-RN. The model was tested using prerequisite course grades, nursing course grades, and scores on standardized examinations for the 2003 associate degree nursing graduates at an urban community college (N = 235). Success was determined through the reporting of a pass on the NCLEX-RN examination by the Florida Board of Nursing. Point-biserial correlations tested model assumptions regarding variable relationships, while logistic regression was used to test the model's predictive power. Correlations among variables were significant, and the model accounted for 66% of the variance in graduates' success on the NCLEX-RN, with 98% prediction accuracy. Although certain prerequisite course grades and nursing course grades were found to be significant to NCLEX-RN success, the overall model was most predictive at the conclusion of the academic program of study. The RN Assessment Examination, taken during the final semester of course work, was the most significant predictor of NCLEX-RN success. Success on the NCLEX-RN allows graduates to work as registered nurses, reflects positively on a school's academic performance record, and supports the appropriateness of the educational program's goals and objectives. The study's findings support other potential uses of McClusky's theory of margin as a predictor of program outcomes in other venues of adult education.
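A minimal sketch of this kind of logistic-regression predictor model follows. The column names (prereq_gpa, nursing_gpa, rn_assessment, passed) and the synthetic data are hypothetical stand-ins, not the study's actual variables.

```python
# Sketch: course grades and a standardized exam score predicting
# NCLEX-RN pass/fail via logistic regression. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 235
df = pd.DataFrame({
    "prereq_gpa": rng.uniform(2.0, 4.0, n),
    "nursing_gpa": rng.uniform(2.0, 4.0, n),
    "rn_assessment": rng.uniform(40, 100, n),
})
# Generate pass/fail outcomes from a known logistic relationship.
logit_p = -8 + 1.0 * df["nursing_gpa"] + 0.08 * df["rn_assessment"]
df["passed"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("passed ~ prereq_gpa + nursing_gpa + rn_assessment", df).fit()
pred = (model.predict(df) > 0.5).astype(int)
print(model.params)
print("classification accuracy:", (pred == df["passed"]).mean())
```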

Relevance: 10.00%

Abstract:

This work is the first to use patterned soft underlayers in multilevel, three-dimensional vertical magnetic data storage systems. The motivation stems from an exponentially growing information stockpile and a corresponding need for more efficient storage devices with higher density. The world information stockpile currently exceeds 150 EB (exabyte = 1×10^18 bytes), most of which is in analog form. Among the storage technologies (semiconductor, optical, and magnetic), magnetic hard disk drives are poised to occupy a large role in personal, network, and corporate storage. However, this mode suffers from the superparamagnetic limit, which caps achievable areal density due to fundamental quantum mechanical stability requirements. Many viable techniques are being considered to defer superparamagnetism into the hundreds of Gbit/in², such as patterned media, heat-assisted magnetic recording (HAMR), self-organized magnetic arrays (SOMA), antiferromagnetically coupled (AFC) structures, and perpendicular magnetic recording. Nonetheless, these techniques utilize a single magnetic layer and can thus be viewed as two-dimensional in nature. In this work a novel three-dimensional vertical magnetic recording approach is proposed. This approach utilizes the entire thickness of a magnetic multilayer structure to store information, with potential areal density well into the Tbit/in² regime. There are several possible implementations of 3D magnetic recording, each presenting its own set of requirements, merits, and challenges. The issues and considerations pertaining to the development of such systems are examined and analyzed using empirical and numerical analysis techniques. Two novel key approaches are proposed and developed: (1) a patterned soft underlayer (SUL), which allows enhanced recording on thicker media; and (2) a combinatorial approach to 3D media development that facilitates the concurrent investigation of various film parameters against a predefined performance metric. A case study is presented using combinatorial overcoats of tantalum and zirconium oxides for corrosion protection in magnetic media. The feasibility of 3D recording is demonstrated, and 3D media development is emphasized as a key prerequisite. The patterned SUL shows significant enhancement over a conventional unpatterned SUL and demonstrates that geometry can be used as a design tool to achieve a favorable field distribution wherever magnetic storage and magnetic phenomena are involved.
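The superparamagnetic limit mentioned above comes from a thermal-stability criterion: a grain's magnetic anisotropy energy must dominate thermal energy for stored data to survive roughly ten years. The rule-of-thumb bound below is the figure commonly quoted in the recording literature, not a number taken from this dissertation:

```latex
% Superparamagnetic (thermal-stability) criterion: the anisotropy energy
% barrier of a grain (anisotropy constant K_u, grain volume V) must
% dominate the thermal energy k_B T; ~10-year retention is commonly
% taken to require
\[
  \frac{K_u V}{k_B T} \gtrsim 40\text{--}60 ,
\]
% so shrinking grain volume V to raise areal density eventually makes
% stored bits thermally unstable.
```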

Relevance: 10.00%

Abstract:

This study investigated the relation of several predictors to high school dropout. The data, composed of records from a cohort of students (N = 10,100) who entered ninth grade in 2001, were analyzed via logistic regression. The predictor variables were: (a) Algebra I grade, (b) Florida Comprehensive Assessment Test (FCAT) level, (c) language proficiency, (d) gender, (e) race/ethnicity, (f) Exceptional Student Education program membership, and (g) socio-economic status. The criterion was graduation status: graduated or dropped out. Algebra I grades were an important predictor of whether students dropped out or graduated; students who failed this course were 4.1 times more likely to drop out than those who passed it. Other significant predictors of high school dropout were language proficiency, FCAT level, gender, and socio-economic status. The main focus of the study was on Algebra I as a predictor, but the study was not designed to discover the specific factors related to or underlying success in this course. Nevertheless, because Algebra I may be considered an important prerequisite for other major facets of the curriculum, and because of its strong relationship to high school dropout, a recommendation emerging from these findings is that districts address the issue of preventing failure in this course. Adequate support mechanisms for improving retention include addressing students' readiness for enrolling in mathematics courses as well as curriculum improvements that enhance student readiness through such processes as remediation. Ensuring that mathematics instruction is monitored and improved, and that remedial programs are in place to facilitate content learning in all subjects for all students, especially those with limited English proficiency, is a critical educational responsibility.
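In a logistic model, the quoted "4.1 times more likely" is an odds ratio. Assuming it is read off the coefficient of a failed-Algebra-I indicator in the usual way (the standard interpretation, though the abstract does not spell it out), the implied coefficient is:

```latex
% Odds ratio from a logistic-regression coefficient \beta of a binary
% predictor (here: failing Algebra I):
\[
  \mathrm{OR} = e^{\beta}
  \quad\Longrightarrow\quad
  \beta = \ln(\mathrm{OR}) = \ln(4.1) \approx 1.41 .
\]
```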

Relevance: 10.00%

Abstract:

We present a unique case of a high school athlete who suffered a coracoid process fracture following a collision with an opposing player. This fracture is commonly misdiagnosed as a clavicular fracture or an AC joint sprain, and the initial radiographic examination may fail to identify the fracture site. Understanding the clinical features of this injury is an important prerequisite to its overall management. Misdiagnosis or deviation from the appropriate course of treatment can inhibit return to play and may be avoided by using indicated diagnostic evaluation tools.

Relevance: 10.00%

Abstract:

The maintenance and evolution of software systems has become a highly critical task over recent years due to the diversity of, and the high demand for, functionalities, devices, and users. Understanding and analyzing how new changes impact the quality attributes of such systems' architecture is an essential prerequisite for avoiding quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results across different releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements exhibiting performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In that study, 21 releases (seven from each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a regression model for performance was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the receiver operating characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether or not a commit will cause degradation is 10% better than a random decision.
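A minimal sketch of the variation-analysis phase (phase iii), assuming repeated execution-time measurements per scenario per release. The Mann-Whitney U test used here is a common non-parametric choice for comparing timing samples, not necessarily the thesis's exact statistical procedure.

```python
# Sketch: flag significant performance variation of one scenario between
# two releases from repeated execution-time measurements.
from statistics import median
from scipy.stats import mannwhitneyu

def performance_variation(times_old, times_new, alpha=0.05):
    """Return (verdict, p_value): 'degraded', 'optimized', or 'stable'."""
    stat, p = mannwhitneyu(times_old, times_new, alternative="two-sided")
    if p >= alpha:
        return "stable", p
    verdict = "degraded" if median(times_new) > median(times_old) else "optimized"
    return verdict, p

# Hypothetical measurements (ms) of one scenario in releases r1 and r2:
print(performance_variation([102, 99, 101, 100, 103],
                            [130, 128, 131, 127, 129]))
```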