974 results for Single-chain variable fragments


Relevance:

100.00%

Publisher:

Abstract:

Fine copy of al-Būṣīrī's poem in praise of the Prophet accompanied by elucidation in Persian and Turkish.

Relevance:

100.00%

Publisher:

Abstract:

"Counterinsurgency (COIN) requires an integrated military, political, and economic program best developed by teams that field both civilians and soldiers. These units should operate with some independence but under a coherent command. In Vietnam, after several false starts, the United States developed an effective unified organization, Civil Operations and Revolutionary Development Support (CORDS), to guide the counterinsurgency. CORDS had three components absent from our efforts in Afghanistan today: sufficient personnel (particularly civilian), numerous teams, and a single chain of command that united the separate COIN programs of the disparate American departments at the district, provincial, regional, and national levels. This paper focuses on the third issue and describes the benefits that unity of command at every level would bring to the American war in Afghanistan. The work begins with a brief introduction to counterinsurgency theory, using a population-centric model, and examines how this warfare challenges the United States. It traces the evolution of the Provincial Reconstruction Teams (PRTs) and the country team, describing problems at both levels. Similar efforts in Vietnam are compared, where persistent executive attention finally integrated the government's counterinsurgency campaign under the unified command of the CORDS program. The next section attributes the American tendency towards a segregated response to cultural differences between the primary departments, executive neglect, and societal concepts of war. The paper argues that, in its approach to COIN, the United States has forsaken the military concept of unity of command in favor of 'unity of effort' expressed in multiagency literature. The final sections describe how unified authority would improve our efforts in Afghanistan and propose a model for the future."--P. iii.

Relevance:

100.00%

Publisher:

Abstract:

Flows of complex fluids need to be understood at both macroscopic and molecular scales, because it is the macroscopic response that controls the fluid behavior, but the molecular scale that ultimately gives rise to rheological and solid-state properties. Here the flow field of an entangled polymer melt through an extended contraction, typical of many polymer processes, is imaged optically and by small-angle neutron scattering. The dual-probe technique samples both the macroscopic stress field in the flow and the microscopic configuration of the polymer molecules at selected points. The results are compared with a recent tube model molecular theory of entangled melt flow that is able to calculate both the stress and the single-chain structure factor from first principles. The combined action of the three fundamental entangled processes of reptation, contour length fluctuation, and convective constraint release is essential to account quantitatively for the rich rheological behavior. The multiscale approach unearths a new feature: Orientation at the length scale of the entire chain decays considerably more slowly than at the smaller entanglement length.
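
For reference, the single-chain structure factor probed by small-angle neutron scattering is conventionally written (a standard textbook form, up to normalisation, not a formula quoted from this paper) as

S(\mathbf{q}) = \frac{1}{N} \sum_{i=1}^{N} \sum_{j=1}^{N} \left\langle \exp\!\big[ i\,\mathbf{q}\cdot(\mathbf{R}_i - \mathbf{R}_j) \big] \right\rangle ,

where the \mathbf{R}_i are the segment positions of a single labelled chain of N segments and the angle brackets denote an average over chain conformations. It is the tube theory's prediction of this quantity, alongside the stress, that is compared with the scattering data.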

Relevance:

100.00%

Publisher:

Abstract:

There is an urgent need for high-purity, single-chain, fully functional Eph/ephrin membrane proteins. This report outlines the pTIg-BOS-Fc vector and purification approach, which resulted in the rapid, increased production of fully functional single-chain extracellular proteins that were isolated at high purity and used in structure-function analysis and pre-clinical studies.

Relevance:

100.00%

Publisher:

Abstract:

Developing the social identity theory of leadership (e.g., [Hogg, M. A. (2001). A social identity theory of leadership. Personality and Social Psychology Review, 5, 184-200]), an experiment (N=257) tested the hypothesis that as group members identify more strongly with their group (salience) their evaluations of leadership effectiveness become more strongly influenced by the extent to which their demographic stereotype-based impressions of their leader match the norm of the group (prototypicality). Participants, with more or less traditional gender attitudes (orientation), were members, under high or low group salience conditions (salience), of non-interactive laboratory groups that had instrumental or expressive group norms (norm), and a male or female leader (leader gender). As predicted, these four variables interacted significantly to affect perceptions of leadership effectiveness. Reconfiguration of the eight conditions formed by orientation, norm and leader gender produced a single prototypicality variable. Irrespective of participant gender, prototypical leaders were considered more effective in high than low salience groups, and in high salience groups prototypical leaders were more effective than less prototypical leaders. Alternative explanations based on status characteristics and role incongruity theory do not account well for the findings. Implications of these results for the glass ceiling effect and for a wider social identity analysis of the impact of demographic group membership on leadership in small groups are discussed. (c) 2006 Elsevier Inc. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

In Statnotes 24 and 25, multiple linear regression, a statistical method that examines the relationship between a single dependent variable (Y) and two or more independent variables (X), was described. The principal objective of such an analysis was to determine which of the X variables had a significant influence on Y and to construct an equation that predicts Y from the X variables. 'Principal components analysis' (PCA) and 'factor analysis' (FA) are also methods of examining the relationships between different variables, but they differ from multiple regression in that no distinction is made between the dependent and independent variables, all variables being essentially treated the same. Originally, PCA and FA were regarded as distinct methods, but in recent times they have been combined into a single analysis, PCA often being the first stage of an FA. The basic objective of a PCA/FA is to examine the relationships between the variables, or the 'structure' of the variables, and to determine whether these relationships can be explained by a smaller number of 'factors'. This Statnote describes the use of PCA/FA in the analysis of the differences between the DNA profiles of different MRSA strains introduced in Statnote 26.
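
A minimal illustrative sketch of the PCA step (hypothetical data, variable counts and names, not taken from the Statnote): all variables enter the analysis symmetrically, and the proportion of variance explained by the leading components indicates whether a smaller number of 'factors' can summarise the structure.

import numpy as np
from sklearn.decomposition import PCA

# Hypothetical example: 30 'strains' scored on 6 correlated variables,
# generated here from 2 underlying latent factors plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(30, 2))        # two hidden factors
loadings = rng.normal(size=(2, 6))       # how each factor loads on the 6 variables
X = latent @ loadings + 0.3 * rng.normal(size=(30, 6))

pca = PCA()
scores = pca.fit_transform(X)            # component scores for each strain
print("variance explained:", np.round(pca.explained_variance_ratio_, 2))
# If the first two components account for most of the variance, the six
# variables' structure can be summarised by two 'factors' (the PCA/FA idea).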

Relevance:

100.00%

Publisher:

Abstract:

The aim of this project was to carry out an investigation into suitable alternatives to gasoline for use in modern automobiles. Such a fuel would provide the western world with a means of extending natural gasoline resources and the third world with a way of cutting down its dependence on the oil-producing countries for its energy supply. Alcohols, namely methanol and ethanol, provide this solution. They can be used as gasoline extenders or as fuels in their own right.

In order to fulfil the aims of the project, a literature study was carried out to investigate methods and costs of producing these fuels. An experimental programme was then set up in which the performance of the alcohols was studied on a conventional engine. The engine used for this purpose was the Fiat 127 930cc four-cylinder engine, chosen because of its popularity in European countries. The Weber fixed-jet carburettor, designed to be used with gasoline, was adapted so that the alcohol fuels and the blends could be used in the most efficient way, mainly to take account of the lower heat content of the alcohols. The adaptation took the form of enlarging the main metering jet, and allowances for the alcohols' lower specific gravity were made during fuel metering.

Owing to the low front-end volatility of methanol and ethanol, it was expected that start-up problems would occur. An experimental programme was set up to determine the temperature range for the minimum required percentage 'take off' that would ease start-up, since it was determined that a 'take off' of about 5% v/v liquid in the vapour phase would be sufficient for starting. Additives such as iso-pentane and n-pentane were used to improve the front-end volatility, and this proved to be successful.

The lower heat content of the alcohol fuels also meant that a greater charge of fuel would be required. This was seen to pose further problems with fuel distribution from the carburettor to the individual cylinders of a multi-cylinder engine. Since it was not possible to modify the existing manifold on the Fiat 127 engine, experimental tests on manifold geometry were carried out using the Ricardo E6 single-cylinder variable-compression engine. Results from these tests showed that the length, shape and cross-sectional area of the manifold play an important part in the distribution of the fuel entering the cylinder, i.e. vapour phase, vapour/small liquid droplet/liquid film phase, vapour/large liquid droplet/liquid film phase, etc.

The solvent properties of the alcohols and their greater electrical conductivity suggested that the materials used in the engine would be prone to chemical attack. In order to determine the type and rate of attack, an experimental programme was set up whereby carburettor and other components were immersed in the alcohols and in blends of alcohol with gasoline. The test fuels were aerated and in some instances kept at temperatures ranging from 50°C to 90°C. Results from these tests suggest that not all materials used in the conventional engine are equally suitable for use with alcohols and alcohol/gasoline blends. Aluminium, for instance, was severely attacked by methanol, causing pitting and pin-holing of the surface.

In general, this experimental programme gave valuable information on the acceptability of substitute fuels. While the long-term effects of alcohol use merit further study, it is clear that methanol and ethanol will be increasingly used in place of gasoline.
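
As a rough illustration of why the main metering jet has to be enlarged, the following sketch estimates the required jet diameter ratio from stoichiometric air/fuel ratios and liquid densities. The property values are approximate textbook figures chosen for the example, not data from the thesis.

import math

# Illustrative, approximate values: (stoichiometric AFR, liquid density in kg/m^3)
fuels = {
    "gasoline": (14.7, 740.0),
    "ethanol":  (9.0, 789.0),
    "methanol": (6.5, 792.0),
}

def jet_diameter_ratio(base: str, new: str) -> float:
    """Ratio of new to original jet diameter for the same air flow and mixture strength.

    Orifice mass flow ~ area * sqrt(density * pressure drop), so
    area_new / area_old = (mdot_new / mdot_old) * sqrt(rho_old / rho_new),
    and diameter scales as the square root of area.
    """
    afr_base, rho_base = fuels[base]
    afr_new, rho_new = fuels[new]
    mdot_ratio = afr_base / afr_new                 # more fuel needed per kg of air
    area_ratio = mdot_ratio * math.sqrt(rho_base / rho_new)
    return math.sqrt(area_ratio)

for alcohol in ("ethanol", "methanol"):
    print(f"gasoline -> {alcohol}: enlarge jet diameter by about "
          f"{jet_diameter_ratio('gasoline', alcohol):.2f}x")

On these assumed values the jet diameter would need to grow by roughly a quarter for ethanol and by nearly half for methanol, consistent with the much greater fuel charge demanded by the lower heat content.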

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is to investigate the various parameters that could control the encapsulation of lipophilic drugs and to investigate the influence of the physical properties of poorly water-soluble drugs on bilayer loading. Initial work investigated the solubilisation of ibuprofen, a model insoluble drug. Drug loading was assessed using HPLC and UV spectrophotometric analysis. Preliminary studies focused on the influence of bilayer composition on drug loading to obtain an optimum cholesterol concentration. This was followed by studies investigating the effect of longer alkyl-chain lipids, unsaturated alkyl-chain lipids and charged lipids. The studies also examined the effects of the pH of the hydration medium and the addition of the single-chain surfactant α-tocopherol. The work was followed up by investigation of a range of insoluble drugs, including flurbiprofen, indomethacin, sulindac, mefenamic acid, lignocaine and progesterone, to investigate the influence of drug properties and functional groups on liposomal loading. The results show that no defined trend could be obtained linking drug loading to the different drug properties, including molecular weight, log P and other drug-specific characteristics. However, the presence of oppositely charged lipids improved the encapsulation of all the drugs investigated, with a similar effect obtained with the substitution of the longer-chain lipids. The addition of the single-chain surfactant α-tocopherol resulted in enhanced drug loading, which is possibly governed by the log P of the drug candidate. Environmental scanning electron microscopy (ESEM) was used to dynamically follow the changes in liposome morphology in real time during dehydration, thereby providing an alternative assay of liposome formulation and stability. The ESEM analysis clearly demonstrated that ibuprofen incorporation enhanced the stability of PC:Chol liposomes.

Relevance:

100.00%

Publisher:

Abstract:

The use of the multiple indicators, multiple causes model to operationalize formative variables (the formative MIMIC model) is advocated in the methodological literature. Yet, contrary to popular belief, the formative MIMIC model does not provide a valid method of integrating formative variables into empirical studies and we recommend discarding it from formative models. Our arguments rest on the following observations. First, much formative variable literature appears to conceptualize a causal structure between the formative variable and its indicators which can be tested or estimated. We demonstrate that this assumption is illogical, that a formative variable is simply a researcher-defined composite of sub-dimensions, and that such tests and estimates are unnecessary. Second, despite this, researchers often use the formative MIMIC model as a means to include formative variables in their models and to estimate the magnitude of linkages between formative variables and their indicators. However, the formative MIMIC model cannot provide this information since it is simply a model in which a common factor is predicted by some exogenous variables—the model does not integrate within it a formative variable. Empirical results from such studies need reassessing, since their interpretation may lead to inaccurate theoretical insights and the development of untested recommendations to managers. Finally, the use of the formative MIMIC model can foster fuzzy conceptualizations of variables, particularly since it can erroneously encourage the view that a single focal variable is measured with formative and reflective indicators. We explain these interlinked arguments in more detail and provide a set of recommendations for researchers to consider when dealing with formative variables.
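
For readers unfamiliar with the specification being criticised, a minimal sketch in standard MIMIC notation (generic notation, not equations quoted from the paper): the common factor \eta is regressed on the exogenous indicators x_1, ..., x_q and measured by reflective indicators y_1, ..., y_p,

\eta = \gamma_1 x_1 + \dots + \gamma_q x_q + \zeta, \qquad y_j = \lambda_j \eta + \varepsilon_j, \quad j = 1, \dots, p.

The x_k enter only as exogenous predictors of the common factor, which is the structure the authors argue cannot, by itself, represent a formative variable.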

Relevance:

100.00%

Publisher:

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and is formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified based on the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method for mining Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the tradeoffs between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
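
A minimal illustration of the bug class described above, an atomicity violation between a pair of threads accessing a single shared variable (a generic textbook example, not code from the dissertation):

import threading

balance = 100  # the single shared variable

def withdraw(amount: int) -> None:
    global balance
    # The check and the update are intended to execute as one atomic step,
    # but nothing enforces that: the other thread may run between the two lines.
    if balance >= amount:            # access 1: read
        balance = balance - amount   # access 2: read-modify-write

threads = [threading.Thread(target=withdraw, args=(80,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Serially, at most one withdrawal of 80 can succeed, so balance should never
# drop below 20; under an unlucky interleaving both threads pass the check
# before either updates, leaving balance == -60.
print("final balance:", balance)

# The standard fix is to make the check-then-act sequence atomic, e.g.:
#   lock = threading.Lock()
#   with lock:
#       if balance >= amount:
#           balance -= amount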

Relevance:

100.00%

Publisher:

Abstract:

Fungal ribotoxins that block protein synthesis can be useful warheads in the context of a targeted immunotoxin. α-Sarcin is a small (17 kDa) fungal ribonuclease produced by Aspergillus giganteus that functions by catalytically cleaving a single phosphodiester bond in the sarcin–ricin loop of the large ribosomal subunit, thus making the ribosome unrecognisable to elongation factors and leading to inhibition of protein synthesis. Peptide mapping using an ex vivo human T cell assay determined that α-sarcin contained two T cell epitopes: one in the N-terminal 20 amino acids and the other in the C-terminal 20 amino acids. Various mutations were tested individually within each epitope and then in combination to isolate deimmunised α-sarcin variants that had the desired properties of silencing T cell epitopes and retention of the ability to inhibit protein synthesis (equivalent to wild-type, WT α-sarcin). A deimmunised variant (D9T/Q142T) demonstrated a complete lack of T cell activation in in vitro whole protein human T cell assays using peripheral blood mononuclear cells from donors with diverse HLA allotypes. Generation of an immunotoxin by fusion of the D9T/Q142T variant to a single-chain Fv targeting Her2 demonstrated potent cell killing equivalent to a fusion protein comprising the WT α-sarcin. These results represent the first fungal ribotoxin to be deimmunised with the potential to construct a new generation of deimmunised immunotoxin therapeutics.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: The prognostic significance of ATM mutations in chronic lymphocytic leukemia (CLL) is unclear. We assessed their impact in the context of a prospective randomized trial. PATIENTS AND METHODS: We analyzed the ATM gene in 224 patients treated on the Leukemia Research Fund Chronic Lymphocytic Leukemia 4 (LRF-CLL4) trial with chlorambucil or fludarabine with and without cyclophosphamide. ATM status was analyzed by denaturing high-performance liquid chromatography and was related to treatment response, survival, and the impact of TP53 alterations for the same patient cohort. RESULTS: We identified 36 ATM mutations in 33 tumors, 16 with and 17 without 11q deletion. Mutations were associated with advanced disease stage and involvement of multiple lymphoid sites. Patients with both ATM mutation and 11q deletion showed significantly reduced progression-free survival (median, 7.4 months) compared with those with ATM wild type (28.6 months), 11q deletion alone (17.1 months), or ATM mutation alone (30.8 months), but survival was similar to that in patients with monoallelic (6.7 months) or biallelic (3.4 months) TP53 alterations. This effect was independent of treatment, immunoglobulin heavy chain variable gene (IGHV) status, age, sex, or disease stage. Overall survival for patients with biallelic ATM alterations was also significantly reduced compared with those with ATM wild type or ATM mutation alone (median, 42.2 v 85.5 v 77.6 months, respectively). CONCLUSION: The combination of 11q deletion and ATM mutation in CLL is associated with significantly shorter progression-free and overall survival following first-line treatment with alkylating agents and purine analogs. Assessment of ATM mutation status in patients with 11q deletion may influence the choice of subsequent therapy.

Relevance:

100.00%

Publisher:

Abstract:

B7-H4 (VTCN1, B7x, B7s) is an inhibitory modulator of the T-cell response implicated in antigen tolerization. As such, B7-H4 is an immune checkpoint of potential therapeutic interest. To generate anti-B7-H4 targeting reagents, we isolated antibodies by differential cell screening of a yeast-display library of recombinant antibodies (scFvs) derived from ovarian cancer patients, and we screened for functional scFvs capable of interfering with B7-H4-mediated inhibition of antitumor responses. We found one antibody binding to B7-H4 that could restore antitumor T cell responses. This chapter gives an overview of the methods we developed to isolate a functional anti-B7-H4 antibody fragment.

Relevance:

100.00%

Publisher:

Abstract:

Our goal in this paper is to extend previous results obtained for Newtonian and second-grade fluids to third-grade fluids in the case of an axisymmetric, straight, rigid and impermeable tube with constant cross-section, using a one-dimensional hierarchical model based on the Cosserat theory related to fluid dynamics. In this way we can reduce the full three-dimensional system of equations for the axisymmetric unsteady motion of a non-Newtonian incompressible third-grade fluid to a system of equations depending on time and on a single spatial variable. Some numerical simulations of the volume flow rate and the wall shear stress are presented.